
Navigating 2024: Key Marketing Trends Set to Reshape the Landscape

Written by Matthew Biboud Lubeck, Vice President EMEA, Amperity

Research demonstrates that only 3 per cent of consumers feel in control of their data online. Yet trust is crucial to driving customer loyalty and growth. In fact, 43 per cent of people say they’d switch from their preferred brand to a second-choice brand if the latter provided a good privacy experience.

From the strategic adoption of zero-copy data practices to the intersection of customer experience aspirations with data management realities and the ever-evolving landscape of AI ethics and regulations, the year ahead promises to reshape how we approach these critical facets. Let’s dive into the practical implications and emerging trends that will redefine marketing strategies, offering valuable insights into the challenges and opportunities that lie ahead.

 

1. Zero Copy Data: Data Should Live in Fewer Places

The paid media and advertising landscape is poised for a transformative shift, driven by a recognition of the limitations in current practices. A key trend emerging in the coming year is the adoption of a zero-copy data philosophy. This approach signifies a strategic move towards centralising customer information and minimising data duplication across platforms.

Marketers will strategically embrace data minimisation in response to challenges posed by privacy regulations and consumer dissatisfaction with intrusive tracking. The industry will shift away from scattered and duplicated data sources, opting for a zero-copy data philosophy that prioritises efficient and non-redundant data access.

This transformation will extend to a comprehensive revamping of the data ecosystem, with a focus on aligning practices with the zero-copy data philosophy. Advertisers and data management platforms will reassess bidding language and communication protocols to ensure privacy and prevent data leakage.

Identity resolution will become a central focus, with advertisers seeking accurate and comprehensive first-party data to build unified customer profiles. This shift aims to address challenges arising from data deprecation and ensure a more reliable and consistent approach to customer identity.
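To make identity resolution concrete, here is a minimal, illustrative sketch of the deterministic idea behind it: records that share an identifier such as an email address or phone number are stitched into one unified profile. This is a toy under invented sample data, not any vendor's method; real platforms layer probabilistic and ML-based matching on top of this.

```python
# Minimal sketch of deterministic identity resolution: records that share an
# email or phone number are grouped into one unified profile. The sample
# records are invented; real systems are far more sophisticated.
from collections import defaultdict

records = [
    {"id": 1, "email": "ana@example.com", "phone": None,              "source": "ecommerce"},
    {"id": 2, "email": "ana@example.com", "phone": "+44 7700 900123", "source": "loyalty"},
    {"id": 3, "email": None,              "phone": "+44 7700 900123", "source": "in-store"},
]

# Union-find over record ids, linked via shared identifiers.
parent = {r["id"]: r["id"] for r in records}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

by_key = defaultdict(list)
for r in records:
    for key in ("email", "phone"):
        if r[key]:
            by_key[(key, r[key])].append(r["id"])

for ids in by_key.values():
    for other in ids[1:]:
        union(ids[0], other)

profiles = defaultdict(list)
for r in records:
    profiles[find(r["id"])].append(r)

for root, members in profiles.items():
    print(f"Unified profile {root}: sources = {[m['source'] for m in members]}")
```

Run against the three sample records above, the sketch links the e-commerce, loyalty and in-store rows into a single profile because records 1 and 2 share an email and records 2 and 3 share a phone number.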

In adopting these strategies, businesses can anticipate improved conversion rates, increased return on ad spend and reduced timelines, costs and risks associated with data management. Next year ushers in a paradigm shift towards zero-copy data practices, where businesses prioritise streamlined, non-redundant data access to enhance customer experiences and align with evolving regulatory and consumer expectations.

 

2. The Reality of the Personalisation Dream

In the coming year, we anticipate a growing realisation among brands that the aspiration for a seamless and personalised customer experience must be closely tied to effective data management and identity capabilities. While many brands may continue to showcase their “personalisation dream” through impressive customer journeys, we predict an increasing awareness that realising this dream hinges on addressing underlying data challenges.

Organisations will recognise the imperative to break down silos that separate data management, identity verification and customer experience teams. They will actively work towards integrating these components, acknowledging that a cohesive approach is vital for delivering the level of personalisation and efficiency customers expect. As a result, brands will invest more in modernising their data infrastructure, linking systems and training teams to utilise data accurately.

The shift will be from presenting an idealised version of customer journeys to actively resolving the practical challenges in data management and identity verification that often impede the achievement of a truly seamless customer experience. The coming year requires a transformation in mindset, recognising that the success of the “personalisation dream” is contingent upon addressing the intricacies of data, identity and customer interactions in a more integrated and strategic manner.

 

3. Embrace the AI Advantage (but don’t completely let go of the steering wheel)

In 2024, marketers must push the boundaries of AI adoption to improve the customer experience – scaling hyper-personalisation by integrating AI across ecosystems. Yet many marketers continue to face a fundamental problem: how do they deliver a personalised experience to millions of customers? And how do they explain the role of AI to consumers so that transparency is maintained, especially in the context of the new European Artificial Intelligence Act (if you follow me on LinkedIn, you’ll know I strongly support any legislation that improves digital accountability and ethical behaviours)?

AI will play a significant role in enabling hyper-personalisation by leveraging data analysis, predictive algorithms and machine learning to tailor experiences, offers and messaging to individual preferences and behaviours with efficiency and speed.

However, marketers need to know what questions to ask to reap the full benefits of AI. Therefore, “prompt engineering” will become a critical skill. This refers to deliberately crafting prompts or input queries to elicit specific responses or behaviours from AI models. Marketers should understand the capabilities and limitations of the AI model, and then tailor prompts to achieve their desired outcomes.
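As an illustration of what prompt engineering means in practice, here is a minimal sketch contrasting a vague request with a structured prompt that states the role, audience, constraints and output format. The generate() function is a hypothetical stand-in for whichever LLM client a team actually uses; nothing here is tied to a specific model or API.

```python
# Minimal sketch of prompt engineering: the same request, phrased loosely and
# then as a structured prompt with role, audience, constraints and output format.
# `generate()` is a hypothetical stand-in for whichever LLM client you use.

def generate(prompt: str) -> str:
    raise NotImplementedError("swap in your LLM client call here")

vague_prompt = "Write something about our winter sale."

engineered_prompt = """
You are a retail email copywriter.
Task: write a subject line and a 40-word preview for our winter sale email.
Audience: lapsed customers who last purchased 6-12 months ago.
Constraints:
  - warm, non-pushy tone; no discounts beyond those in the brief
  - mention free returns exactly once
Output format (JSON): {"subject": "...", "preview": "..."}
Brief: 20% off knitwear, ends 31 January.
"""

# The engineered prompt narrows the model's search space, which is the point:
# marketers who know what to ask get usable output; vague prompts get noise.
# response = generate(engineered_prompt)
```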

Of course, this can only be done effectively with accurate data. If the data feeding the AI is dirty or incomplete, marketers run the risk of receiving inaccurate insights that can undermine their strategies and outcomes. By feeding AI’s predictive capabilities with accurate data, marketers can improve the customer experience and help brands acquire and retain customers.

 

In this new landscape, marketers must also evolve their roles to become AI governors. This will allow them to become creative about applying AI throughout the marketing process while retaining checks and balances to remain accountable for AI-powered experiences. If done correctly, AI can help add significant speed, ease and improved performance across campaign and audience strategies. I predict we will increasingly lighten human intervention across more of these workflows, but not in the first few years of experimentation.

 

ABOUT THE AUTHOR

Matthew Biboud Lubeck, Vice President EMEA, Amperity

Matthew is the vice president of EMEA where he is responsible for the commercial expansion of Amperity, a leading customer data platform trusted by brands like Reckitt, Under Armour and Wyndham Hotels & Resorts. Lubeck joined Amperity in 2017 to help launch the company and has served in a number of key roles building sales, customer success, and marketing functions. Matthew established Amperity’s LGBTQ employee resource group (ERG) and is a trusted advisor and customer-centricity change agent to the C-suite across leading consumer brands.

Prior to Amperity, Lubeck spent 10 years with global beauty conglomerates Estée Lauder Group and L’Oréal as Group Head of Customer Data Strategy and Analytics, leading 30 brands across luxury, mass and salon professional divisions to better use data and unlock incredible beauty experiences, establishing L’Oréal as an industry leader. He resides in London with his husband and four-year-old daughter.

 

ABOUT AMPERITY

Amperity delivers the data confidence brands need to unlock growth by truly knowing their customers. With Amperity, brands can build a first-party data foundation to fuel customer acquisition and retention, personalise experiences that build loyalty, and manage privacy compliance. Using patented AI and ML methods, Amperity stitches together all customer interactions to build a unified view that seamlessly connects to marketing and technology tools. More than 400 brands worldwide rely on Amperity to turn data into business value, including Alaska Airlines, DICK’S Sporting Goods, Endeavour Drinks, Planet Fitness, Seattle Sounders FC, Under Armour and Wyndham Hotels & Resorts.

Looking to 2024: Data, AI and security will be top priorities for businesses

Written by Nathan Vega, Vice President of Product Marketing and Strategy at Protegrity

The technology landscape has evolved significantly over the last year, with the introduction of technologies such as ChatGPT and other generative AI tools taking the market by storm while raising concerns about data security. As we move into 2024, we anticipate that these technologies will continue to pave the way forward, with AI remaining a hot topic in the industry while data security concerns rise around it.

 

Transparency

While data is considered the new oil, customers are going to expect more transparency from companies about what they are doing with it. Just as Heinz Ketchup became a leading brand when it introduced a transparent bottle that let customers see exactly what was inside, businesses will be expected to offer a similar level of transparency when it comes to data.

Currently, companies are being forced to share details of what they are doing with customer data, and we expect to see more privacy regulations coming into effect to protect citizen data further. At the same time, we anticipate that companies will start to explore options for international data hubs that have been designed to meet stringent privacy laws to keep customer data safe.

 

Fragmented AI

There has been an impressive uptake in AI by businesses over the past year and, thanks to the likes of ChatGPT, many consumers today are using it as well. We expect to see the adoption of AI continue to grow in the year ahead. However, AI is currently quite fragmented and complicated, and we expect to see this changing in a similar way to cloud computing, starting out fragmented and simplifying over time.

A big issue for AI is the skills shortage. While the overall technology industry is facing a skills shortage, there is a major shortage of AI experts and talent in the industry. There will not be a quick fix, and this will hamper development in the AI space. At the same time, businesses are likely to experience the trough of disillusionment with AI and GenAI as companies grapple with the technology without realising its full potential.

 

Analysing and protecting data

Businesses are realising the value of analysing data, which has been made easier to extract with the help of AI. As such, companies will continue to invest in this technology and those that haven’t will be playing catch up. However, AI presents a challenge, in that privacy could be easily compromised as anyone with access to GenAI could extract the data too.

As such, companies will need to consider how they protect data from being accessed and used by unauthorised individuals, while at the same time giving those who need the data the required access. Companies are coming to realise the value of protecting data with solutions such as tokenisation, which keeps information segmented while giving access to the specific data that business units or individuals need to perform their jobs. In doing this, they are able to protect the most valuable data and minimise the risk of unauthorised users accessing data they shouldn’t.
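To illustrate the idea behind tokenisation, the following toy sketch swaps a sensitive value for a random token while a vault keeps the mapping, so analysts can still join and aggregate on the token but only an authorised role can recover the original. This is a simplified illustration under invented data and a crude role check, not Protegrity's implementation or a production-grade scheme.

```python
# Toy sketch of vault-based tokenisation (not a production scheme): sensitive
# values are swapped for random tokens; only an authorised role may look the
# original value back up. Analysts can still join and count on the token.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:          # same value -> same token,
            return self._value_to_token[value]     # so joins still work
        token = "tok_" + secrets.token_hex(8)
        self._value_to_token[value] = token
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str, role: str) -> str:
        if role != "payments-ops":                 # coarse role check, for illustration only
            raise PermissionError("role not authorised to detokenise")
        return self._token_to_value[token]

vault = TokenVault()
orders = [{"card": vault.tokenize("4111 1111 1111 1111"), "amount": 42.50},
          {"card": vault.tokenize("4111 1111 1111 1111"), "amount": 17.00}]

# Marketing or analytics sees only tokens but can still aggregate per card:
print(orders)
# Only the authorised business unit can recover the original value:
print(vault.detokenize(orders[0]["card"], role="payments-ops"))
```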

With data breaches only set to rise in 2024 and beyond, this is of the utmost importance.

 

Disinformation and the impact of AI

While businesses are adopting AI with caution, attackers are adopting and using these technologies much faster and collaborating to weaponise AI. This is where we are likely to see the biggest development taking place. As such, 2024 is likely to bring breaches led by GenAI techniques, whether in the form of phishing emails, doctored videos and images, or all of these combined.

At the same time, a clever use of data manipulation could damage data models, leading to inaccurate predictions that could have a massive impact on a business or government entity. This data poisoning, which involves tampering with the data used to train machine learning (ML) models to produce undesirable outcomes, is going to be a growing concern for organisations that hold a lot of data, as the more data you have, the more likely it is that bad data is contained within it. This is another reason why companies will be turning to data protection tools to aid in data security.
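The following small, self-contained example illustrates the mechanism of data poisoning by flipping a fraction of training labels and showing how a simple classifier's accuracy degrades. The data is synthetic and the attack deliberately crude; real poisoning is subtler, but the cause and effect (tainted training data leading to bad predictions) is the same.

```python
# Illustrative label-flipping example of data poisoning: flipping a fraction of
# training labels degrades a simple classifier. Synthetic data, crude attack;
# the mechanism (tainted training data -> bad predictions) is what matters.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def fit_and_score(labels):
    model = LogisticRegression(max_iter=1000).fit(X_train, labels)
    return accuracy_score(y_test, model.predict(X_test))

rng = np.random.default_rng(0)
poisoned = y_train.copy()
flip = rng.choice(len(poisoned), size=int(0.3 * len(poisoned)), replace=False)
poisoned[flip] = 1 - poisoned[flip]            # flip 30% of training labels

print("clean training data   :", round(fit_and_score(y_train), 3))
print("poisoned training data:", round(fit_and_score(poisoned), 3))
```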

 

Data, data everywhere

The pandemic might be behind us, but it changed the way we work forever. The gig economy has grown significantly, and this will have implications for business going forward as skilled workers sell their time and access company data wherever they happen to be.

Innovation in hybrid work environments, and the ability to access whatever data and tools you need from wherever you are, already has great appeal for many workers, and they are unlikely to want to work for an organisation that does not at least offer a hybrid option. From a data perspective, this accentuates the need for data to be protected and for companies to implement solutions that meet regulatory requirements across various territories: remaining compliant, keeping commitments to secure customer and employee data, and keeping employees happy.

In addition to meeting customer expectations for data security and privacy, or risking losing those customers, more and more companies will be investing in meeting compliance standards, while others will be fined for non-compliance with regulatory standards such as PCI and DORA.

Data is going to either make or break businesses in 2024. As technologies continue to evolve, people will demand their data is secure and, as threat actors become more relentless, organisations will have to continue to go beyond the regulations and checkboxes to keep data secure. They will need to bring data security to the boardroom table, making it a key topic for discussion that focuses on data use and the protection of it for the best interest of their customers, employees and their business.

Aerospike Database 7 Delivers Enterprise Resiliency, Scale, and Speed to In-memory Database Workloads

Aerospike Inc. today released Version 7 of its real-time, multi-model database with a new unified storage format and other significant in-memory database enhancements.

The unified storage format in Aerospike Database 7 provides the flexibility to choose the right storage engine for different kinds of workloads, even within the same cluster. Developers no longer need to understand the intricacies of in-memory, hybrid-memory, and all-flash storage models. And in-memory deployments gain fast restarts for enterprise-grade resiliency and compression that shrinks the memory footprint of real-time applications.

“Enterprises traditionally turn to in-memory databases for sub-millisecond performance, but they are often brittle, slow to recover, and a hassle to manage and scale,” said Lenley Hensarling, chief product officer, Aerospike. “Now, any enterprise can deploy resilient in-memory applications on a modern, real-time database without compromising on speed, scale, or reliability — and use a fraction of the hardware and development effort of other offerings.”

One-third of Aerospike customers already use Aerospike for in-memory applications. The Aerospike Database optimizes performance with a unique hybrid memory architecture. Enterprises can choose in-memory, all flash, or hybrid storage to maximize performance and scale for a particular workload.

 

Major Aerospike Database 7 New Features

Major features of Version 7 include:

  • Unified Storage Format for all Storage Engines. A single storage format and the same programming logic simplify application deployment and operations.
  • Warm Restarts for High Resiliency. The new in-memory storage engine puts data in shared memory instead of process memory (RAM), enabling restarts in as little as a few seconds — instead of minutes or hours.
  • In-memory Compression Maximizes Memory Efficiency. Version 7 is the first database to enable LZ4, Snappy or ZStandard compression algorithms to gain the same compression ratios regardless of storage engine, saving on hardware cost.
  • In-memory Data Mirrored to Persistence Layer for Best Performance. When deploying in-memory with persistence, all operations like defrag, garbage collection, tomb raiding, and re-balance take place in memory without touching the drive except for mirroring the changes.
  • Deploy without Drives. In-memory namespaces can be deployed without a persistence layer by taking advantage of the Aerospike Shared Memory Tool (ASMT) to persist the namespace from shared memory to the file system after shutdown.

 

One Multi-model Database Simplifies Operations and Reduces Costs

The Aerospike Database handles diverse workloads across the most popular NoSQL data models — key value, document, graph, plus SQL access for analytics — in a single real-time data platform. Aerospike’s multi-model approach simplifies database operations and delivers low-latency, high-throughput processing across data models while handling mixed workloads from gigabyte to petabyte scale.

Operational benefits of Aerospike’s multi-model approach have resulted in proven, best-in-industry total cost of ownership (TCO), which shows that even at the highest levels of scale, Aerospike requires up to 80% less infrastructure than traditional key-value, document, or graph databases.

 

 

UK Data Expert Reveals How to Get Your Brand ‘Holiday Ready’

Written by Matthew Lubeck, Vice President EMEA, Amperity

The way consumers shop during the holidays has undergone a significant transformation in recent years. Not only are they starting earlier, but as the UK’s cost-of-living crisis continues, they’re also spending less across all categories. As brands shift into ‘holiday mode’, below are some critical insights to keep in mind for a successful and impactful campaign season.

Prepare for privacy changes

Privacy changes continue to impact the landscape. As it stands, the EU General Data Protection Regulation (GDPR) remains the strongest privacy and security law in the world. In fact, just recently Meta and TikTok were fined heavily for violating its privacy rules.

Furthermore, in Q1 2024, Google will begin disabling third-party cookie tracking for 1 per cent of Chrome users. With Chrome holding roughly a 65 per cent market share, this is something advertisers cannot ignore – and as cookies are disabled more widely over time, the impact will only get more pronounced.

Trends you need to know now:

  1. First-party data: Freely given, first-party consumer data is now critical, not just for targeting but also for attribution. Companies already turning to first-party data are running clean A/B tests and, without relying on third-party data, connecting data on the back end to build a clearer profile of each unique customer (a minimal sketch of such a test follows this list).
  2. Contextual targeting: Marketing to a single person based on static attributes is giving way to contextual targeting based on what people are doing at a given moment rather than who they are.
  3. Data clean rooms: One of the biggest areas the continued deprecation of cookies will affect is measurement and attribution. Clean rooms are filling that gap, telling advertisers whether certain types of marketing were effective, drove incremental sales, and so on.
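For illustration, here is a minimal sketch of the kind of clean A/B test first-party data enables: comparing the conversion rates of two email variants with a two-proportion z-test. The variant descriptions and conversion figures are invented.

```python
# Minimal sketch of a first-party-data A/B test: compare conversion rates of two
# email variants with a two-proportion z-test. All numbers here are made up.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided
    return p_a, p_b, z, p_value

# Variant A: generic subject line; Variant B: personalised subject line.
p_a, p_b, z, p = two_proportion_z(conv_a=420, n_a=10_000, conv_b=505, n_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z={z:.2f}  p={p:.4f}")
```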

AI: Your own personal Santa’s helper

Using generative AI in retail has been a big differentiator in creating propensity models for insights into why consumers purchase specific items – and how to promote more sales. Tasks that used to be very manual are now fed into AI engines to devise hyper-targeted messaging at scale.
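As a concrete illustration of a propensity model (a predictive model, distinct from the generative AI used for content), here is a minimal sketch that scores customers on their likelihood to buy in the holiday window and ranks them for targeting. The features and figures are invented.

```python
# Sketch of a simple purchase-propensity model: score customers on how likely
# they are to buy in the holiday window, then target the highest-propensity
# segment first. Features and data below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: [days_since_last_purchase, orders_last_12m, opened_last_email]
X = np.array([[12, 6, 1], [300, 1, 0], [45, 3, 1], [7, 9, 1],
              [180, 2, 0], [30, 4, 0], [400, 1, 0], [21, 5, 1]])
y = np.array([1, 0, 1, 1, 0, 1, 0, 1])     # bought during last year's holidays

model = LogisticRegression(max_iter=1000).fit(X, y)

candidates = np.array([[15, 5, 1], [250, 1, 0], [60, 2, 1]])
scores = model.predict_proba(candidates)[:, 1]
for features, score in sorted(zip(candidates.tolist(), scores),
                              key=lambda t: -t[1]):
    print(f"propensity={score:.2f}  features={features}")
```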

Even more, if you don’t have enough holiday content to speak to different segments with different tones, using AI is a budget-conscious way to crank out a lot of content variations quickly and with minimal effort.

Whether your advertising strategy is operating at a baseline (one-to-one communications) or a more advanced level (AI-driven), you can still prepare and succeed.

Understand and address the challenges you’re faced with today to:

  • Get ahead of things like lead times to pull specific lists
  • Develop the right copy to execute on those campaigns
  • Pre-test systems to prevent outages
  • Find the solutions to better know your customers and present them with relevant offers at the right place and time.

Balancing holiday demands

It’s crucial to have the agility to target and act between Black Friday and the close of the fiscal year. Often in retail, we need to move fast to see what’s going on beneath the sales figures – whether they’re soft or strong – and decide which levers to pull or hold back on.

Ask yourself:

  • Which holiday shoppers bought last year but haven’t shopped yet, and why?
  • Who are your ‘gifters’, and have you seen them yet?
  • How can you design and target promotions that maximise basket size across different types of spenders and push them higher?

 

And one last question – how hard is it to do this at the speed of the holidays without ruining a perfectly great ugly-sweater party because you’re stuck at the office running analytics jobs or waiting for data?

 

Checking your customer data wish list

You’ll collect a lot of consumer data and need to put it to work. Consider where you sit on the maturity curve and what types of initiatives you can put on your roadmap early in 2024 to ensure you’re stepping forward as an organisation.

  • Baseline: Develop core buyer personas and begin defining initial segmentation
  • Basic Targeting: Improve operational process to generate customer segments
  • Cross-Channel: Test and analyse segment performance across channel mixes
  • 1:1 Personalisation: Begin operationalising personalised recommendations and next best action engines
  • AI-driven Activation: Incorporate a broader range of touchpoints into decisioning (e.g. product reviews, customer care surveys, return rates)

 

Out with the old

It’s time to expect the unexpected. Old-school holiday campaign timing and targeting simply don’t work anymore in light of new consumer behaviours. Customers are buying seasonal items earlier, focusing more on necessities rather than sale-priced luxury items and buying up-trending items shared as pairings or group displays.

Disruptive sales events like Amazon’s Prime Day are demonstrating some of those behaviours. The eCommerce giant had huge numbers on Prime Day 2023, with sales reaching an estimated £10 billion worldwide – a 6 per cent increase over last year. This indicates a positive direction for the economy, which is great news for advertisers. But it also stems from smart planning.

Below are three tactics Amazon used along with my recommendations for your holiday strategies:

1. Invitation-only deals: Personalised, advanced offers based on consumers’ previous purchase histories.

  • Full access to every consumer’s unique buying profile will be key to tailoring unique communications like these.

2. Best value vs. lowest price offers: More consumers purchased essential items versus lower-priced luxury ones.

  • With smaller budgets, consumers are shopping for necessities – and having the insights to target them in the right place at the right time will be critical.

3. Pairing of orchestrated offerings: Amazon saw a large uptick around its curated lists and specific trending products. Think of the mass appeal across demographics of Barbie movie merchandise – Dad buys an “I’m Kenough” t-shirt, Mom nabs a hot pink jacket and the kids get a couple of “Weird Barbie” dolls.

  • Staying on top of emerging trends and offering easy access to grouping suggestions could be a gamechanger for you, too.

Don’t delay – ‘sleigh’ the holiday season with technology

There are a lot of basics that organisations can do right now to get ahead of the holidays and 2024 planning. However, don’t stall. If you wait until mid-November to start, you and your customers will come away empty-handed.

So ask yourself, do you have what you need to make these strategies easier? Or does your current tech just maintain the status quo? The goal here is to move up that maturity curve to put yourself in a better position than you may be now.

Pro tip: A customer data platform that merges all your physical and digital data into unified customer profiles – using zero-, first- and second-party data (and even third-party sources as a supplement where appropriate) – is a critical way to stay current, keep sales moving in a positive direction and predict the best ways to campaign into the future. If you haven’t already, look into it now.

To learn more about ways to prepare, plan, and execute for a successful Black Friday and 2024, check out Amperity’s recent webinar: Countdown to Black Friday: Strategies to Win Holiday 2023.

 

About the author

Matthew Lubeck is Vice President EMEA, Amperity

Not So Remote: The Large Possibility of Data Loss When Workers Are Offsite

Written by Mark Johnson, Senior Director of Global Alliances, Arcserve

As we all know, the COVID pandemic triggered a rapid and widespread shift to remote work that persists today – and for good reason. Remote work offers many benefits. It gives employees greater flexibility over their schedules, eliminates the commute, enhances health and happiness, and boosts productivity. There are also downsides, of course. For one, the rise of remote work introduced a host of cybersecurity concerns, as employees who work outside the office don’t have the security protections that come with it.

Are companies responding? Many are not. For example, only 38% of financial services companies have a backup and recovery solution for remote employees, according to an eye-opening new study from Arcserve. It is a risky proposition. Financial services companies that don’t have a security solution for remote employees expose themselves to a wide range of serious threats, including data loss, regulatory noncompliance, and operational disruption.

Of course, the situation may not be as dire as that 38% number suggests since much of the data held by financial institutions probably doesn’t exist on employees’ devices. Most of it resides on secure servers, either onsite or in the cloud. Indeed, employees aren’t even allowed to save files to their desktops at many financial companies. They must log into remote virtual desktops or save data on SharePoint or OneDrive, both cloud-based. Strict controls and monitoring capabilities are in place because the main goal is to have as little data as possible on individual devices.

Where remote backup falls short

The financial sector is in pretty good shape regarding security. But if we consider remote workers outside the financial services sector, where regulations and security measures are typically less stringent, we can identify serious issues.

In many industries, individuals are allowed to save data to their devices, which poses a problem. Backing up remote workers is always a challenge because, in many cases, organisations aren’t managing the devices themselves. Instead, they focus on controlling access to company or web-based resources like Microsoft 365, NetSuite, and Salesforce.

Most companies don’t secure remote devices adequately, and even when they try, protecting those devices is inherently difficult because they’re constantly on the move. Traditional backup and recovery methods often fall short when the laptop is unavailable or offline. Not all backup and recovery software solutions automatically resume backup when the device comes online, resulting in days of data loss.
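To show what "resuming where you left off" can look like, here is a simplified, hypothetical sketch of checkpoint-based backup: the client records which files (and versions) have already been uploaded, so when a laptop comes back online only new or changed files are sent. The state file name and upload stub are assumptions for illustration; this is not any vendor's product.

```python
# Illustrative sketch of checkpoint-based backup resumption: remember which
# files have already been backed up (and their modification times/sizes), and
# on the next run upload only what is new or changed. This captures the idea of
# "resume where you left off" for an intermittently online laptop.
import json
from pathlib import Path

STATE_FILE = Path("backup_state.json")       # hypothetical local checkpoint

def load_state():
    return json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}

def upload(path: Path):
    print(f"uploading {path} ...")            # stand-in for the real upload call

def run_backup(root: Path):
    state = load_state()
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        stat = path.stat()
        fingerprint = f"{stat.st_mtime_ns}:{stat.st_size}"
        if state.get(str(path)) == fingerprint:
            continue                          # already backed up, skip
        upload(path)
        state[str(path)] = fingerprint
        STATE_FILE.write_text(json.dumps(state))   # checkpoint after every file

if __name__ == "__main__":
    run_backup(Path("~/Documents").expanduser())
```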

That’s why, with the rise of remote work, the goal is to store as much data as possible in the cloud or corporate servers and reduce reliance on individual devices. This way, if a laptop is lost or damaged, data remains accessible. Like transferring data to a new phone, the idea is to enable users to log in and access their data seamlessly. However, achieving this goal is challenging due to the offline nature of remote work.

Four keys to protecting remote data

In the context of remote working, here are four key lessons and security best practices that organisations can learn from and adopt.

1: Centralize your data

While some individuals may have copies of data on remote workstations, the goal should be to centralise data on corporate servers or cloud-based solutions like Office 365, Salesforce, NetSuite, or a similar platform.

2: Secure remote devices

Whether it’s a laptop, iPad, or home computer, all devices serve as a gateway to corporate systems. For instance, the laptop on my desk is the portal through which I access applications like Salesforce and Office 365. The goal is to enhance the security of these devices. The approach should include robust endpoint security, strong authentication, and regular updates to protect devices from malware, unauthorised access, and known vulnerabilities.

3: Train your users

It’s worth emphasising that many vulnerabilities arise from compromised user credentials. These credentials aren’t limited to super admins or high-level staff. Even regular user credentials can be exploited to wreak havoc, especially on platforms like Microsoft 365. Ensuring that your users are well-informed and vigilant is crucial. Ultimately, organisations should aim to foster a culture of security where every employee understands their role in maintaining cybersecurity and feels responsible for protecting sensitive data and systems.

4: Continuously update your policies and procedures

This best practice applies to security, backup, recovery, and user access policies. These must be updated regularly to keep pace with changes in your environment, such as introducing new applications. For instance, if you look at some major ransomware incidents, many were executed using outdated credentials. When employees leave a company and their credentials aren’t promptly revoked, it creates a vulnerability. Falling behind in policy updates can lead to mismatches between your policies and the data you must protect, whether for backup or security purposes.

Final takeaway

The global shift to remote work has brought both opportunities and challenges. While remote work offers enhanced flexibility and adaptability, it has opened up cybersecurity threats and increased data vulnerability. Considering these issues, there are overarching lessons and high-level security principles that all organisations should heed. These are especially important now that we live in an era where remote work is not merely a temporary response to unforeseen circumstances but an enduring and integral component of the contemporary organisation.

 

Nutanix Strengthens Cyber Resilience with Accelerated Ransomware Detection and Recovery

Nutanix has today announced new features in the Nutanix Cloud Platform to strengthen organisations’ cyber resilience against ransomware attacks on unstructured data. These new features, available today in Nutanix Data Lens™ and Nutanix Unified Storage™ solutions, enable organisations to detect a threat, defend from further damage and begin a 1-click recovery process within 20 minutes of exposure. The features build on the strength of Nutanix Cloud Platform to protect and secure customers’ most sensitive data across clouds.

Ransomware is a top priority for CIOs and CISOs globally, yet, according to the Enterprise Cloud Index, 93% of organisations report they need to be better prepared. Speed of detection is more critical now that the average ransomware attack duration has shortened by 94% as threat actors become more efficient at breaching, exfiltrating, and deploying a ransomware payload that compromises data. Fast data recovery is also essential since recovery can typically take days or even weeks, and incomplete recovery can impact operations long after the attack is over.

“Rapid detection and rapid recovery are two of the most critical elements in successful ransomware planning, yet they remain a challenge for many organisations, especially as they manage data across multiple clouds,” said Scott Sinclair, Practice Director with the Enterprise Strategy Group. “With Nutanix Data Lens and Nutanix Unified Storage, the Nutanix Cloud Platform now provides a 20-minute detection window and 1-click recovery, with cyber resilience integrated at the unstructured data layer to simplify protection while accelerating both detection and recovery.”

 

Nutanix Data Lens is a SaaS-based data security solution that helps proactively assess and mitigate unstructured data security and compliance risks by identifying anomalous activity and auditing user behaviour. New capabilities include:

  • Ransomware Detection and Blocking within 20 Minutes: Fast detection combined with automated response helps quickly block attacks, thus minimising the overall impact of ransomware. This provides an extra layer of security to protect an organisation’s unstructured data, reducing data damage.
  • Ransomware 1-Click Recovery: Nutanix Data Lens and Nutanix Unified Storage will identify the last known good snapshot and will automatically recover the share from that snapshot (a simplified sketch of this snapshot-selection idea follows this list). Customers will have the option to leverage an automated recovery or a manual, guided recovery to quickly restore normal operations.
  • Permission Visibility and Risk Visualisation: This enables customers to better understand complex permission structures, audit configurations, and better assess risks. Ensuring data access is aligned to business needs is critical to minimise data loss, support regulatory compliance, and reduce the impact of attacks.
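As a purely illustrative toy, and not a description of how Nutanix implements these features, the sketch below shows the two ideas in the bullets above: flagging a burst of suspicious file writes as a likely ransomware event, then choosing the newest snapshot taken before that event for recovery. The extensions, thresholds and restore call are all assumptions.

```python
# Toy illustration (not Nutanix's implementation) of the ideas in the bullets
# above: flag a burst of suspicious file writes as a possible ransomware event,
# then roll the share back to the last snapshot taken before that event.
from datetime import timedelta

SUSPICIOUS_EXTS = {".locked", ".crypt", ".enc"}   # assumed encrypted-file extensions
WINDOW = timedelta(minutes=5)
THRESHOLD = 100                                   # suspicious writes per window

def detect_event(write_log):
    """write_log: list of (timestamp, filename) tuples sorted by time."""
    suspicious = [(t, f) for t, f in write_log
                  if any(f.endswith(ext) for ext in SUSPICIOUS_EXTS)]
    for start, _ in suspicious:
        in_window = [t for t, _ in suspicious if start <= t < start + WINDOW]
        if len(in_window) >= THRESHOLD:
            return start                          # time the attack likely began
    return None

def last_good_snapshot(snapshots, event_time):
    """snapshots: list of snapshot timestamps; pick the newest one before the event."""
    good = [s for s in snapshots if s < event_time]
    return max(good) if good else None

# event = detect_event(write_log)
# if event:
#     restore(share, last_good_snapshot(snapshots, event))   # hypothetical restore call
```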

“Many organisations struggle with managing data protection across storage silos and clouds, especially when it comes to data governance and security,” said Thomas Cornely, SVP, Product Management at Nutanix. “With these new ransomware detection and recovery features, the Nutanix Cloud Platform provides built-in ransomware protection, data visibility and automated data governance for Nutanix Files and Objects across clouds to simplify data protection and strengthen an organisation’s cyber resilience posture.”

Nutanix Data Lens Now Supports Nutanix Objects

Customers using object storage will now have the same intelligence and forensics available across both Nutanix Files™ and Nutanix Objects™ solutions. The advanced auditing and forensic capabilities in Nutanix Data Lens now extend to Nutanix Objects. This enables customers to reduce security and regulatory risks and further simplifies their path to integrated data management across clouds.

These new capabilities build on the Nutanix Cloud Platform’s natively integrated networking and security across clouds. Nutanix Cloud Platform delivers built-in cyber resilience capabilities including automated platform hardening with 1-click data encryption, secure network policies, application microsegmentation, and ransomware data protections. This strengthens an organisation’s overall security posture for data and applications across on-premises, public clouds and edge.

Nutanix customers shared:

  • “Understanding access to our data is very important for us to ensure data is secure, safe, and being used properly,” said Robert Pohjanen, IT Architect, LKAB. “Tools like Data Lens give us the insights we need to understand who is accessing our data, if it’s appropriate access, or if there is an attempt to misuse or attack our data. The forensics and the new permissions and access risk views are important tools to keep our data safe from malicious users, or from threats such as ransomware.”
  • “We suffered a ransomware attack, in which our entire legacy infrastructure and our backup were compromised,” said Brunno Amado da Silva Vieira, Senior IT Infrastructure Analyst, Unimed Belém Medical Work Cooperative. “Thanks to Nutanix technology, we were able to recover the most critical applications in a short time, to minimise the impact. Nutanix ended up saving everything, and made us decide that from then on, we will leave the legacy technology and migrate all workloads to Nutanix.”

New features are currently available to customers. More information is available here.

The Power Of Digital Asset Management: How It Helps Workflows

Businesses have increasingly become dependent on a variety of digital assets, such as videos, images, audio clips, presentations, and various document types, for their daily operations. Consequently, quick and efficient access to these resources has also become necessary.

In this context, the conversation often gravitates toward the debate between DAM and document management, as each system presents distinct advantages in managing different business resources. Digital Asset Management (DAM), however, stands out for its specialized ability to comprehensively manage a diverse array of digital assets, streamlining the processes of storing, organizing, retrieving, and distributing them.

This article elucidates how incorporating a DAM system into a business’s operational structure can significantly enhance workflow effectiveness and elevate overall productivity. 

  • Improved Collaboration

DAM systems offer a platform where all team members can access, collaborate, and work on digital assets, regardless of their geographical location. This instant resource access eliminates the need for tedious email threads and file sharing via external devices. Moreover, by providing version control and audit trails, DAM ensures everyone works on the most updated asset, reducing redundancies and miscommunications. 

Furthermore, DAM promotes cross-departmental collaboration. Different teams can use the same resources without interfering with each other’s work. This can be instrumental in maintaining consistency across various projects and tasks while allowing a clear understanding of how different assets are being used within the organization. 

  • Centralized Storage

One of the fundamental advantages of DAM is that it provides a single, centralized location to store all digital assets. This centralized repository ensures that every piece of digital content, regardless of format or size, is organized and easily accessible. This eliminates the risk of losing data due to misplaced or misnamed files, enhancing workflow efficiency. 

Moreover, a centralized storage system makes sharing and distributing digital assets easier across various channels. Whether it’s a social media post or a marketing campaign, the relevant assets can be located and utilized swiftly, making the process seamless and efficient.

  • Enhanced Security

Digital assets are valuable business resources requiring stringent security measures to prevent unauthorized access and use. A DAM system offers robust security features like access control, watermarking, and encryption to protect assets from misuse or theft. These measures secure the assets and regulate who can access what, allowing the administrators to maintain control over the assets. 

In addition, DAM systems have backup and disaster recovery features. These prevent data loss due to system failures, natural disasters, or human error. Thus, organizations can be assured that their assets are well-protected and readily available, adding an extra layer of confidence to their workflows. 

  • Streamlined Workflows 

Through DAM, businesses can establish an orderly workflow for creating, approving, and distributing digital assets. It eliminates bottlenecks in the creative process by allowing multiple users to work on an asset simultaneously and streamlining the approval process with built-in workflows. This significantly reduces turnaround times and improves overall efficiency. 

Additionally, DAM can automate repetitive tasks like renaming files, converting to different formats, or assigning metadata. This allows the team members to focus more on creative tasks, enhancing output quality and increasing productivity. 

  • Brand Consistency 

By providing a centralized repository of brand-approved assets, DAM systems ensure that all teams within an organization use consistent, up-to-date assets. This means that the brand messaging remains consistent no matter where the business communicates with its audience—whether through a social media post, an email campaign, or a print ad. 

Moreover, DAM systems can be configured to automatically update assets across all platforms whenever changes are made to the central repository. This feature ensures that changes to brand guidelines or asset designs are immediately reflected across all platforms. It streamlines the process of maintaining brand consistency and reduces the risk of outdated or off-brand content being used. 

  • Compliance And Legal Protection 

DAM systems can help businesses comply with legal regulations and copyright laws. They can manage and track licenses for digital assets, ensuring that the use of assets is always within legal boundaries. This is particularly important for businesses operating in industries with stringent regulatory requirements. 

Moreover, DAM systems can store crucial information such as usage rights, model releases, and copyright information. This ensures all team members are aware of any restrictions associated with a particular asset, minimizing the risk of copyright infringement and potential legal issues. 

  • Cost Efficiency 

Investing in a DAM system can result in significant cost savings over time. By reducing the time spent searching for assets, eliminating unnecessary asset duplication, and automating manual tasks, businesses can save time and money. The increased efficiency and productivity can ultimately lead to improved profitability. 

Furthermore, a cloud-based DAM solution can reduce the costs associated with maintaining physical servers and IT infrastructure. This, combined with the scalability of cloud-based solutions, makes DAM a cost-effective option for businesses of all sizes. 

Conclusion 

Digital asset management is a powerful tool for businesses looking to harness the full potential of their digital assets. By promoting improved collaboration, enhancing security, streamlining workflows, providing centralized storage, encouraging better asset utilization, facilitating compliance, and offering cost efficiency, DAM significantly improves workflows and increases overall productivity.  

As the digital landscape evolves, the importance of effectively managing digital assets cannot be overstated. Thus, implementing a DAM system could be a game-changing strategy for businesses aiming to thrive in the digital age.

 

 

Amperity Named 2023 Databricks Built on Partner of the Year

Delta Sharing solution helps brands unify data to drive accurate insights and segmentation to deliver personalised customer experiences

Amperity, the leading enterprise customer data platform (CDP) for consumer brands, today announced it has been awarded the 2023 Built on Partner of the Year Award by Databricks, the Data and AI company. With Databricks and Amperity, brands such as Caleres, a global footwear company, have maximised the value of their customer data, minimised costs, and increased data democratisation to generate and send insights to their downstream systems.

The award was presented recently at Databricks’ Data + AI Summit 2023 and underscores the impact Amperity has made in developing Databricks competency, helping joint customers solve their customer data challenges and break into new revenue streams.

“Amperity has enhanced the way brands identify, understand, and connect with their customers by leveraging AI to deliver a comprehensive and actionable Customer 360,” said Steve Sobel, Global Industry Leader for Communications, Media and Entertainment at Databricks.  “Amperity plays an important role in our ecosystem as we accelerate our composable CDP practice and help organisations unlock the elusive unified customer profiles to improve marketing performance, fuel accurate customer insights, and enable world-class, real-time customer experiences on the Lakehouse.”

With the rise of the modern data stack, Databricks and Amperity offer a future-proof approach to behavioural data creation, data storage and management, personalisation and digital experiences. Amperity applies patented machine learning methods to resolve customer identity and standardise all digital and offline interactions to build a unified customer profile. This enables Amperity to correctly identify every customer so that the data foundation Databricks uses in its predictive models is more accurate and delivers better outcomes.

“We are extremely honoured to be recognised as the Built on Partner of the Year by Databricks. Together, we’ve had the privilege to see the outsized value our joint customers have achieved by composing our unique capabilities together,” said Derek Slager, co-founder & CTO at Amperity. “We are excited to continue building on this partnership, applying the latest innovations in generative AI to our unified customer data foundation to drive the next generation of customer personalisation and business value.”

For more information about Amperity’s partnership with Databricks, visit our website.

 

About Amperity

Amperity delivers the data confidence brands need to unlock growth by truly knowing their customers. With Amperity, brands can build a first-party data foundation to fuel customer acquisition and retention, personalise experiences that build loyalty, and manage privacy compliance. Using patented AI and ML methods, Amperity stitches together all customer interactions to build a unified view that seamlessly connects to marketing and technology tools. More than 400 brands worldwide rely on Amperity to turn data into business value, including Alaska Airlines, DICK’S Sporting Goods, Endeavour Drinks, Planet Fitness, Seattle Sounders FC, Under Armour and Wyndham Hotels & Resorts. For more information, visit amperity.com

 

Scottish Government Agency Leverages Nutanix Cloud Clusters (NC2) to Embrace Cloud Technology

Nutanix today announced that Forestry & Land Scotland (FLS) has upgraded its datacentre infrastructure to a hyperconverged infrastructure (HCI), selecting the Nutanix Cloud Platform to support a workload of 300 virtual machines. FLS opted for Nutanix Cloud Clusters (NC2) on Microsoft Azure, a hybrid cloud solution that functions as a single cloud, allowing users to manage apps and infrastructure in their private cloud and Azure. With Nutanix NC2, FLS has been able to migrate its whole datacentre to Microsoft Azure without the time, effort and expense of re-engineering applications for native deployment.

Founded in 2018 as part of the Scottish devolution process, FLS manages over 1.5 million acres of national forests and land – close to 9% of the Scottish land mass – with a wide remit to promote and manage not just forestry but tourism, leisure, nature conservation and other related activities across the area. To meet the short-term IT needs of a newly devolved Scottish government agency whilst, at the same time, supporting its move to the public cloud in line with a cloud-first government policy, FLS needed to rapidly revamp its legacy on-premises datacentre.


 

“We initially saw NC2 as a kind of stopgap,” said Nick Mahlitz, Senior Digital Infrastructure Manager, FLS, “that would allow us to pick up our on-premise Nutanix datacentre and run it on Microsoft Azure while we went about re-engineering applications to run on that platform natively. However, it soon became clear that NC2 could be a lot more than a halfway house. In fact it could deliver many, if not all, of the benefits of public cloud without the time, effort and extra expenditure required for full native migration.”

FLS was already using Microsoft Azure to provide for disaster recovery of its on-premise datacentre, so, naturally, the organisation first looked at re-engineering for native operation of its applications on that platform. FLS soon realised that NC2 for Azure would be a better, quicker and more cost-effective approach, enabling it to stretch its existing environment seamlessly into the cloud and migrate workflows at its own pace without having to transform or re-engineer the code in any way. The migration to the Nutanix Cloud Platform offered immediate benefits in terms of both performance and on-demand scalability. It also resulted in a significantly smaller datacentre footprint in terms of both physical space and power and cooling requirements – yet another key benefit of value to this environmentally focused organisation.

As with the original datacentre project, Mahlitz and the team faced a lot of unknowns in reaching this conclusion, but here, too, Nutanix was able to help – in particular, by arranging a proof-of-concept trial of Nutanix NC2 on Microsoft Azure involving actual FLS production workloads.

 

Colt successfully completes the deployment of fibre network infrastructure along the Channel Tunnel

The new fibre-optic cable deployed along the Tunnel provides access to the highest number of data centres fibre-connected between the UK and mainland Europe

Colt Technology Services (Colt), the digital infrastructure company, today announced the successful completion of the deployment of a new dark fibre (G.652D/G.657) cable along the Channel Tunnel connecting London (UK) to Paris (France).

Colt’s new fibre network will provide businesses and organisations with faster, more reliable, and uninterrupted data connectivity across borders. It will also support the growing demand for cloud computing and other digital services in both London and Paris, with a seamless end-to-end SLA.

Responsible for the day-to-day operations of all new cables installed in the Channel Tunnel, Colt boasts a transfer rate of several Tbps per fibre pair with the new infrastructure. Customers will have access to the 100Gbps/400Gbps Colt IQ Network, supported by reliable low-latency, high-bandwidth connectivity and service guarantees through the tunnel. The tunnel offers the best path to close a key network loop that runs between London, Paris, Brussels, and Amsterdam.

In accordance with the Channel Tunnel’s safety and security regulations, all connectivity services will be highly secure. They will offer significantly more capacity than subsea cables and will be placed in a highly protected environment that is not vulnerable to piracy, ship anchors, commercial fishing, or shipping. The tunnel has had no accidents since it opened in 1994.

“We are pleased to announce the successful completion of the next step in this important project,” said Herve Jost, Director of Eurotunnel/Getlink Connectivity Solutions, Colt Technology Services. “Approximately 80% of the internet traffic which runs between the UK and mainland Europe today travels through the Channel Tunnel. It is crucial to increase the bandwidth, enhance performance, bolster reliability, and fortify the security of our network. This is vital to ensure a continuous and uninterrupted flow of data, specifically to address peak demand and new bandwidth needs within the Channel Tunnel, driven by new technologies such as AI and the Metaverse. Our new fibre cable in the Tunnel will transform the subterranean rail link between the UK and mainland Europe into a major data route to support the anticipated growth in traffic over the coming years.”

In September 2021, Colt and Getlink signed a 25-year exclusive ‘concession’ contract to install and operate a new fibre-optic network spanning the Channel Tunnel, using the latest connectivity technology to establish seamless connectivity between the UK and mainland Europe. This 25-year agreement is bringing new dark fibre and data services to the Channel Tunnel for the first time in a generation.