Tag Archives: Data management

FactSet and Alveo Collaborate to Help Clients with Last Mile Data Integration

Following the successful integration of FactSet’s ESG content into Alveo’s data management platform, FactSet and Alveo are pleased to announce a collaboration that combines their respective data and data management capabilities to provide customers with integrated solutions that more easily put FactSet content into clients’ workflows and databases.

The collaboration will minimize the time needed to onboard new data sets and business applications, helping customers make full use of FactSet’s data. This includes FactSet’s ESG and wider corporate actions services, as well as FactSet’s Concordance for security identifiers and legal entity recognition.

With the help of Alveo’s data management solutions, FactSet content can be cross-referenced and linked with client data sets or content sourced from other data vendors within a client’s business process. Additionally, the collaboration will provide capabilities for data lineage, data governance, data cleansing, and cloud-based delivery and sharing, with integration into customers’ workflows and cloud data warehouses.

Jonathan Reeve, SVP, Head of Content and Technology Solutions at FactSet, said: “We are pleased to work with Alveo’s cloud-based data management so that clients can cross-reference and integrate our content and blend it with their internal data to slash the time required to onboard new data sets, reports, and business applications, as well as expedite the incorporation into their business workflows.”

Alveo focuses on the integration of market and reference data from multiple external and internal sources covering all asset classes, including pricing information, referential data, ESG data, and issuer and corporate actions data. Alveo’s cloud-based data mastering and data quality solution tracks the collection, integration and quality-vetting of data from a diverse set of content providers. Alongside collaborating on integrated solutions, Alveo has extended its set of off-the-shelf integrations with FactSet content, including corporate actions data.

Mark Hepsworth, CEO, Alveo, said: “Combining data and data management into a Data as a Service offering that enables clients to more easily onboard new data is, we believe, a growing requirement across the industry. Clients have a lot going on within their internal ecosystems and we need to make the integration of data as simple as possible. We are very pleased to work with FactSet on this initiative. They have a wide range of content and data management capabilities that we can further extend by combining our solutions within the cloud.”

About Alveo

Alveo is the leader in market data integration and analytics solutions for financial services. Focused on optimizing data flows for business user self-service, we provide cloud-native data aggregation and data quality management that enables clients to easily access trusted data while maximizing their data ROI.

Through our managed services, we ensure that clients can smoothly onboard, prepare and validate data for use in operations, trading, investment management, pricing, risk, reporting and machine learning.

We service a global client base, and our award-winning technology provides easy integration into business user workflows and a proven platform for advanced analytics. Through combining deep domain expertise with the latest open-source technologies, we help financial institutions ensure high-quality data, optimize market data cost and maximize productivity.

Dufrain secures investment from Phoenix Equity Partners

Dufrain, the market-leading data management services and data analytics consultancy, today announces that it has secured investment from Phoenix Equity Partners (“Phoenix”), a UK growth-focused private equity firm, to fuel the next stage of its growth as part of a management buyout.

Founded in 2010, Dufrain helps its financial services and banking clients discover, manage and optimise their data to gain valuable insights and make better-informed business decisions. The business employs more than 170 data experts, providing a comprehensive set of data management solutions from its offices in Edinburgh, Manchester and London. Dufrain’s strategic approach and reputation for service excellence have seen it become a trusted partner to clients of all sizes across the banking and insurance sectors, among others.

Dufrain has delivered year-on-year revenue growth of 40% over the past two years. Phoenix’s investment will allow the business to continue this growth trajectory by expanding its footprint across multiple geographies and sectors, as well as by investing in its in-house technology solutions.

 

Joseph George, CEO of Dufrain, commented:

“Every organisation faces challenges in terms of extracting value from the data they have or could have available. This management buyout will enable the leadership team to continue to build and invest in the business as we pursue our ambitious growth plans. We look forward to drawing on Phoenix’s expertise to help us achieve our plans over the coming months and years.”

 

Chris Neale, Partner at Phoenix, said:

“The data services market is large, global and growing quickly. Dufrain is perfectly placed to capitalise on this as the leading independent data solutions provider in the UK. Over the years that we have been following Dufrain we have been impressed by its fantastic culture, strong client relationships and first-class quality of work. We are very excited to have the opportunity to partner with Joseph and the team to help achieve our shared vision.”

Redgate Software Adopts Policy-Driven Approach to Data Protection with New Data Catalog Release

Cambridge UK, Thursday, 26 May – In a move to help businesses simplify their data management practices by automating policy decisions, the latest release of Redgate Software’s SQL Data Catalog now provides a simple, policy-driven approach to data protection. As well as automatically scanning columns within databases and using intelligent rules to make recommendations about how they should be classified, it auto-generates static data masking sets from the classification metadata that can be used to protect the databases.

This is a timely move because many organizations, such as those in the financial services and healthcare sectors, are now obliged by law to ensure that all sensitive or personal data is identified and removed or protected before databases are made available for use in development, testing, analysis, training or other activities.

This is not a one-off exercise, but an ongoing effort that requires a continuous approach to data protection, typically involving three steps. First, organizations need a data protection plan that identifies and classifies which databases hold data that needs protecting, and how. They then need to implement the plan in a way that guarantees sensitive and personal data is always removed or obfuscated, for example by masking it in database copies used outside secure production environments. And thirdly, the plan has to be maintained on a rolling basis as databases and their data expand and change.

As Bloor states in its Sensitive Data Management Spotlight paper: “This must all be done continuously. When new data enters your system, you should be automatically determining if it is sensitive, anonymising it if it is, and applying access rules as appropriate. This is most easily done via (automated) policy management, thus allowing you to manage incoming sensitive data indefinitely and thereby make sure that your organisation doesn’t relapse into noncompliance.”

This is a big challenge for many organizations, reflected in the scale of the research, resources, planning and time it requires. It’s a hard process to get right first time and impossible, without automation, to get it right every time data is created or refreshed in a non-production system.

SQL Data Catalog v2 marks a step change in this process by significantly reducing the time it takes to go from identification and classification to protection, and making maintenance far simpler. When connected to a SQL Server instance, it automatically examines both the schema and data of each database to determine where personal or sensitive data is stored.

An extensive set of built-in classification rules, which can be customized to align with particular regulatory requirements, then speeds up data classification with automatic suggestions based on the automated data scan. This identifies which columns need to be masked, either manually or by using a tool like Redgate Data Masker, which can sanitize the data using the auto-generated data masking sets provided.
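To make the classify-then-mask step concrete, here is a minimal, hypothetical sketch of static data masking, assuming a small SQLite table with made-up column names rather than Redgate’s own tooling: columns flagged as sensitive by a classification scan are overwritten with deterministic pseudonyms before the database copy leaves the secure production environment.

    import hashlib
    import sqlite3

    def pseudonymise(value: str, salt: str = "masking-demo") -> str:
        """Replace a sensitive value with a stable, non-reversible token."""
        return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

    # Stand-in for a database copy headed to a non-production environment
    # (table and data are invented for illustration).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, full_name TEXT, email TEXT)")
    conn.execute("INSERT INTO customers VALUES (1, 'Ada Lovelace', 'ada@example.com')")

    # Suppose the classification step has flagged full_name and email as sensitive:
    # overwrite them in place so the copy can be shared outside production.
    rows = conn.execute("SELECT id, full_name, email FROM customers").fetchall()
    for row_id, name, email in rows:
        conn.execute(
            "UPDATE customers SET full_name = ?, email = ? WHERE id = ?",
            (pseudonymise(name), pseudonymise(email) + "@masked.invalid", row_id),
        )
    conn.commit()

    print(conn.execute("SELECT * FROM customers").fetchall())

Tools like SQL Data Catalog and Data Masker automate the equivalent of this loop across whole instances and, as described above, keep the masking sets in step with the classification metadata as databases change.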

Importantly, as databases are added, and existing databases modified, the data classifications are automatically maintained in SQL Data Catalog, and the data masking sets it creates can be updated on demand.

This policy-driven approach enables organizations to streamline their data management processes by automating and maintaining their security posture. This protects their sensitive data, puts in place an auditable workflow, and ensures they stay compliant with the relevant regulations.

Spectra Logic Strengthens Tape Storage Leadership Position with LTO-9 Availability for its Family of Tape Library Solutions

With LTO-9, Spectra TFinity Tape Library Stores an Exabyte of Uncompressed Data, Making it the World’s Largest Data Storage System

Spectra Logic, a leader in data management and data storage solutions, today announced the availability of the ninth generation of LTO technology for its family of tape library solutions, including the Spectra TFinity ExaScale, T950, T950v, T680, T380, T200, T120, T50e and Spectra Stack. These tape libraries support LTFS, WORM, hardware encryption, and Spectra’s exclusive Media Lifecycle Management. Certified with all major ISV packages, Spectra’s tape libraries seamlessly integrate with Spectra’s BlackPearl® Object Storage Platform, enabling users to create a multi-storage ecosystem for maximum storage, access and preservation of data across disk, tape and cloud. To learn more about LTO-9, register for Spectra Logic’s upcoming webinar, “Unleashing the Power of LTO-9,” on 23rd September 2021, at 4 p.m. BST.

“The industry is seeing a resurgence of tape storage deployments by organisations with growing data repositories, even if they already have disk and cloud, because tape provides the greatest storage capacity at the lowest cost per terabyte with exceptional reliability,” said Christophe Bertrand, senior industry analyst at Enterprise Strategy Group. “Tape technology, such as LTO-9, also offers unmatched air gap protection against ransomware, enabling organisations to protect their data offline and beyond the reach of malevolent threat actors.”

The new ninth generation of LTO (Linear Tape-Open) tape technology is designed for even greater data storage density and speed, providing up to 18TB of native capacity per cartridge (up to 45TB with compression*), surpassing the capacity of LTO-8 by 50 percent. Full-height LTO-9 drives deliver uncompressed transfer speeds of up to 400 MB/second, almost two times faster than current hard disk drives. In addition, LTO-9 drives are backward compatible with LTO-8 tape media, allowing users to read and write LTO-8 media.
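As a quick sanity check, the short calculation below reproduces those headline figures from the numbers quoted in this release; the 12 TB native capacity of LTO-8 is the published figure for that generation and the 56,000-slot TFinity count is quoted later in the piece, so both are carried over as assumptions rather than new claims.

    # Reproducing the LTO-9 capacity figures quoted above.
    lto9_native_tb = 18        # native capacity per cartridge
    compression_ratio = 2.5    # footnoted 2.5:1 compression ratio
    lto8_native_tb = 12        # published LTO-8 native capacity (assumption, not stated above)
    tfinity_slots = 56_000     # TFinity slot count quoted later in this release

    print(lto9_native_tb * compression_ratio)            # 45.0 TB compressed per cartridge
    print(100 * (lto9_native_tb / lto8_native_tb - 1))   # 50.0 percent more than LTO-8
    print(tfinity_slots * lto9_native_tb / 1_000_000)    # ~1.0 exabyte of native TFinity capacity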

“Today’s news of LTO-9 tape availability confirms the extensibility of this popular open format that was developed more than 20 years ago,” said Nathan Thompson, CEO and Founder of Spectra Logic. “With each generation, new features have been introduced, such as WORM, LTFS and hardware encryption. We recently made software updates for our installed base of thousands of LTO libraries to enable customers to add or upgrade to LTO-9. And since we are the first to offer a tape library – the TFinity — that can store an astounding exabyte of native data with LTO-9 drives and media, we couldn’t be more excited about the long-term prospects of LTO tape technology for our customers.”

 

The World’s Largest Data Storage System with LTO-9

Designed to meet demanding backup, archive and data protection requirements, the Spectra TFinity Tape Library addresses exponential data growth in such industries as high performance computing, scientific research, media and entertainment, cloud storage, education, healthcare, finance and traditional IT. The Spectra TFinity provides organisations with the ultimate in capacity, performance, reliability and scalability at a much lower price point than other storage options.

 

Highlights of Spectra’s TFinity Tape Library Include:

  • Tri-Media – The only tape library to support three tape technologies in the same library (LTO, IBM’s TS11xx and Oracle’s T10000)
  • Scalable from 3 to 45 frames, over 56,000 slots, with bulk load capability
  • Advanced dual robotics for higher performance and greater reliability
  • Maximum compatibility to work with any third-party software package
  • Custom front library panels that showcase customer brand

 

Spectra LTO-9 Certified Tape Media

With today’s news, Spectra extends its Certified Media process to support and improve the characteristics of LTO-9 media. Spectra Certified Media undergoes a rigorous inspection and verification process to ensure that it is the highest possible grade of tape media on the market, and comes with a lifetime guarantee. Additionally, Media Lifecycle Management ensures the safety of data by providing continuous assessments of more than 40 different metrics throughout the life of each tape. Detailed reporting helps mitigate media problems and restore issues, allowing users to copy and move data onto new tapes should it be necessary.

 

*2.5:1 compression ratio


About Spectra Logic Corporation

Spectra Logic develops a full range of data storage and data management solutions that help customers store, manage, preserve and use vast amounts of data to advance their strategic missions and increase value for their organisations. Dedicated solely to data storage innovation for more than 40 years, Spectra Logic empowers organisations worldwide to harness the power of their data with scalable and modern solutions to accelerate breakthroughs and success in the market. To learn more, visit www.Spectralogic.com.

How to Navigate the Cookie Apocalypse

The demise of third-party cookies has added layers of confusion to the data privacy debate, and when consumers are confused, their default is typically to disable and delete.

For consumers, the consequence is that they will need to recreate passwords and re-authenticate themselves. Previous engagement history with a brand may be forgotten and the relationship may go back to zero. For marketers, this raises the concern of how best to (re-)connect with their audience in order to create meaningful, unique content for each consumer. 

So, why are so few brands really engaging with their customers about the cookie apocalypse? Roy Jugessur, VP, EMEA, at Acoustic, says that the key is to cut through the confusion: make everything simple to enable marketers and consumers to move forward in a new ‘cookie-less’ state.

What is a Cookie?

We’re all used to seeing the notifications pop up on websites we visit, but does the average person really understand what a cookie is and how it works? Understanding this is the first step. Cookies are small text files holding pieces of data, such as a login token or session identifier, that help a website recognise a specific browser when it returns. They are essential to the modern internet, helping provide a more personal, convenient online experience by enabling websites to remember key information, for example user logins, shopping baskets and previous browsing behaviour.
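As a concrete sketch of what such a small piece of data looks like, the snippet below builds a hypothetical login cookie using Python’s standard http.cookies module; the session_id name and value are invented for illustration, and in practice a site stores an opaque identifier like this rather than the credentials themselves.

    from http.cookies import SimpleCookie

    # A hypothetical login cookie: the site stores an opaque identifier,
    # and the browser returns it on later visits so the user stays recognised.
    cookie = SimpleCookie()
    cookie["session_id"] = "abc123"           # made-up value for illustration
    cookie["session_id"]["secure"] = True     # only send over HTTPS
    cookie["session_id"]["httponly"] = True   # not readable by page scripts
    cookie["session_id"]["samesite"] = "Lax"  # limit cross-site sending

    # The header the server would add to its HTTP response, e.g.
    # Set-Cookie: session_id=abc123; HttpOnly; SameSite=Lax; Secure
    print(cookie.output())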

For marketers specifically, cookies are used to collect data in order to deliver the most relevant and targeted content, helping to customise every interaction along the customer’s journey. Utilising cookies can be an efficient way to reach target customers at scale, track digital behaviours, and reduce expenses because marketing activity is focused on the right audience. 

Data Privacy Concerns 

Regulations, including the EU’s GDPR and California’s CCPA, have made people significantly more savvy about data collection, management, storage and deletion, and about the rules that should govern them. Since the introduction of GDPR three years ago, however, the waters have become somewhat muddied, with Brexit and the COVID-19 pandemic adding to public dissatisfaction over data sharing.

A recent study found that 97% of consumers reported that data privacy is a concern, with 54% worrying about what companies do with their data. Additionally, WhatsApp’s need to clarify its updated privacy policy following mass public outcry reveals just how concerned people are about the misuse of their personal data.

These calls for greater privacy controls have led to Google, Apple, and Microsoft, amongst others, phasing out the use of third-party cookies (and targeted adverts fueled by them) on their browsers because the data gathered hasn’t been offered to the company voluntarily by the consumer. Instead, as of 2023, brand websites will have to implement and use their own cookies to track user behaviour.

Moving Forward 

Moving forward requires a customer-first attitude: prioritising zero-party data (where customers intentionally and proactively share data with a brand) and first-party cookies and data, collected with the consumer’s consent from a company’s own website or other sources, for example its CRM system or surveys.

Crucially, this data should be used to enrich the customer experience, with brands ensuring consumers get visible and tangible benefits from sharing their data. Using contextual analytics, marketers can still infer consumer behaviour and intent by analysing signals and patterns within both zero-party and first-party data to fuel the personalised experience. This is where MarTech can help to connect the brand with the right audience by utilising the relevant data they do have. Rather than focusing on a specific marketing channel like email, brands now need to consider the holistic customer journey and the many channels on which they can connect with consumers.

When putting this into practice, it’s essential that brands are transparent about how they handle customer data and ensure they give customers the choice to opt out. Asking permission and giving clear indications of how customer data is being used is an essential part of using data ethically – a crucial step forward for data privacy. Marketers must also remember that even though this data has been given, it is still not owned by the brand and therefore should not be abused.

Conclusion

While many within the industry may lament the demise of the third-party cookie, forward-thinking brands and marketers should be embracing the opportunity to develop a customer-first approach to marketing activities: programs focused on getting a relevant and meaningful message to the right person, at the right time, without compromising data privacy. There are also exciting developments in alternatives to third-party data, including approaches that utilise automation and artificial intelligence.

By embracing other targeted approaches and bypassing third-party cookies, choosing instead to focus on data volunteered by customers, brands will be able to develop better and deeper relationships with their audience. Additionally, choosing the right MarTech solution can support this shift in mindset and will lead to more dedicated and loyal customers. For brands and marketers alike, the cookie apocalypse doesn’t have to mean the world will crumble. By focusing on data privacy and putting the customer first, brands will be able to move forward with confidence in the ‘cookie-less’ state.