Sanofi Gains Speed and Agility with an Azure Hybrid Cloud Strategy

Since its founding in 1973, Sanofi has relentlessly pushed the boundaries of medicine. With a mission to deliver life-saving medical treatments and vaccines to people everywhere, Sanofi relies on its cutting-edge science and manufacturing to make the impossible possible for millions of people around the world. When faced with the limitations of its on-premises infrastructure, Sanofi adopted a hybrid cloud strategy and chose Microsoft Azure as its cloud platform, gaining the speed, agility, and reliability necessary for innovation.

A need for speed

As a global pharmaceutical company, Sanofi must follow stringent compliance, cybersecurity, and quality control requirements. To uphold those requirements, Sanofi has kept data and applications on-premises.

But on-premises infrastructures are costly, cumbersome, and can suffer from slow performance. Outages are another potential disadvantage of physical infrastructure, slowing the ability to launch products in a timely fashion. For instance, if a mission-critical app went down, more time and resources would be spent fixing the problem rather than on research and development.

To deliver on its mission of transforming the practice of medicine, Sanofi needed a way to work faster and more efficiently while maintaining compliance requirements. The answer? A hybrid cloud strategy, which would allow Sanofi to benefit from cloud-specific advantages like speed, agility, and scale while retaining on-premises control.  

After reviewing several options, Sanofi chose Microsoft Azure as its strategic cloud platform. With 70 percent of Sanofi’s network already running on Microsoft Windows Server, this was an easy choice. Migrating to Azure would help Sanofi reduce costs and give it the option to explore open source technology approaches with Linux support.

From months to hours

Undergoing a large-scale migration is no small feat. To help ensure a smooth migration, Sanofi worked with Mobiz, a Microsoft Partner Network member specializing in cloud infrastructure and network automation, digital transformation, and data protection. Sanofi and Mobiz settled on a digital transformation plan that includes migrating more than 15,000 servers and 1,800 apps over the next few years and chose Azure Migrate and Azure VMware Solution to move the workloads.

Despite being early in its migration journey, Sanofi is already seeing major benefits. Thanks to the global presence of Azure, Sanofi is now able to set up new regions in record time at a fraction of the cost. “With help from Microsoft and Mobiz, we were able to deliver a fully qualified landing zone in Azure in one-third the time and at one-third the budget compared to previous cloud efforts. For example, Sanofi can deploy a new fully qualified landing zone in a matter of hours, when previously it would take us six months or more to open a new site or country,” says Sam Chenaur, Vice President and Global Head of Infrastructure at Sanofi. 

Sanofi is also experiencing fewer outages and faster recovery times, improving its mean time to recovery (MTTR) by a factor of three to five. Overall, these improvements allow Sanofi to manufacture and deliver high-quality medicine without interruption.

Automate everything

So why has Sanofi’s migration been successful thus far? Automation. 

As Hamad Riaz, Chief Executive Officer at Mobiz, puts it: “Our goal at Sanofi is to automate everything. So, when we say ‘100-percent automation’ we mean building automation around infrastructure and building automation engines.” 

Using the Microsoft Cloud Adoption Framework, Sanofi can automatically deploy new infrastructure regions using baseline templates with all necessary requirements automatically added on. “With Azure, we can tie everything together in a single pane of glass: business requirements, compliance, and infrastructure—everything,” says Riaz. “Azure is the most tightly integrated product on the market today.”
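
The article does not publish Sanofi’s templates, but the general pattern it describes is straightforward: a baseline infrastructure template is deployed programmatically into each new region, so compliance, networking, and monitoring requirements arrive with the landing zone instead of being configured by hand. The sketch below illustrates that pattern with the Azure SDK for Python; the subscription ID, region, tags, and template file name are hypothetical placeholders, not Sanofi’s actual configuration.

```python
# Minimal sketch (hypothetical values throughout): deploy a baseline
# "landing zone" ARM template into a new region with the Azure SDK for Python.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"      # placeholder
REGION = "westeurope"                      # hypothetical new region
RESOURCE_GROUP = f"rg-landingzone-{REGION}"

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# 1. Create (or update) the resource group that anchors the landing zone.
client.resource_groups.create_or_update(
    RESOURCE_GROUP,
    {"location": REGION, "tags": {"workload": "landing-zone"}},
)

# 2. Deploy the baseline template; compliance settings (policies, networking,
#    monitoring) live in the template so every region starts from the same baseline.
with open("landingzone.baseline.json") as f:  # hypothetical template file
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    RESOURCE_GROUP,
    "baseline-landing-zone",
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            "parameters": {"location": {"value": REGION}},
        }
    },
)
print(poller.result().properties.provisioning_state)
```

Because the whole environment is described by the template, it can also be torn down and redeployed on demand, which is what makes rebuilding an outage-prone environment cheap.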

From speeding up app development through templates to reducing errors, Sanofi is using Azure automation to work faster and more efficiently. Developers are empowered to build new apps—quickly—using templates, as well as tear down and rebuild outage-prone applications with minimal effort. “Azure gives Sanofi the ability to easily scale workloads up and down as needed in dynamic ways,” says Chenaur. 

With the Azure hybrid cloud offering, Sanofi employees can work faster and more efficiently, which means a faster time-to-market for bringing life-saving medicines to patients. “Digital transformation at Sanofi is driving a more agile business and a faster time to market to manufacture and deliver medicine,” says Chenaur. “The future with Microsoft is incredibly exciting and the possibilities are endless.”

North Carolina Department of Health and Human Services to Modernize Medicaid Systems Platform with KPMG and Red Hat

KPMG collaborates with Red Hat to deliver an integrated platform to transform the state’s Medicaid software program for streamlined care delivery

New York, NY – August 17, 2022 – KPMG LLP today announced that the State of North Carolina Department of Health and Human Services has selected the KPMG Resource Integration Suite (KRIS) Connected Platform to integrate multiple technology solutions and enable optimized health outcomes across the state. The KRIS Connected Platform primarily uses Red Hat OpenShift, the industry-leading enterprise Kubernetes platform, to implement a central systems integration cloud platform and modernize the state’s Medicaid software operations, helping to streamline the delivery of critical health services.

The North Carolina Department of Health and Human Services (NCDHHS) manages the delivery of health and human services for North Carolinians, working closely with healthcare professionals, community leaders, advocacy groups, and many local, state, and federal entities to care for NC residents. The department saw an opportunity to modernize its Medicaid program, a health insurance program for low-income individuals and families who cannot afford health care costs, to better serve citizens and improve the digital experience for care providers in the program.

With KPMG and Red Hat, NCDHHS is shifting away from its legacy environments to adopt modern, cloud-based applications that integrate disparate systems and comply with Centers for Medicare & Medicaid Services (CMS) guidance, enabling patients and care providers to more safely and efficiently access vital health information. NCDHHS selected KPMG’s KRIS Connected Platform, powered by Red Hat OpenShift, to enhance application and data interoperability for its Medicaid software systems.

The KPMG KRIS Connected Platform can enable NCDHHS to respond more rapidly to market demands and help decrease the overall cost of the state’s technology footprint by using Red Hat OpenShift to support cloud-native, containerized workloads. In addition, NCDHHS will use Red Hat Integration to more seamlessly connect the various applications and technology solutions needed to address nuanced elements of the Medicaid system.

KPMG and Red Hat’s collaboration continues to provide enhanced hybrid cloud experiences for customers while meeting requirements for regulated industries such as health care. By combining Red Hat’s open source technologies with KPMG’s services, such as the KRIS Connected Platform, customers such as NCDHHS can more easily modernize systems to better meet the needs of patients and care providers. 

###

Red Hat, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries.

Supporting Quotes:

The KRIS Connected Platform has seen a lot of interest from Medicaid and the U.S. Department of Health and Human Services because it is modular, flexible and ideal for meeting the integration and shared services needs of local, state and federal government agencies. Most importantly, it supports the future of hybrid cloud integration, and we are glad to work with Red Hat to bring these transformational opportunities to our clients.

Mark Calem, Advisory Managing Director, Health and Government Solutions, KPMG LLP

By leveraging the KPMG KRIS Connected Platform and Red Hat technology, we are bringing North Carolina’s Medicaid program into the next generation so we meet the long-term needs of North Carolinians. This collaboration has been an important step to ensure our systems are innovative, transparent, and can respond to programmatic changes faster.

Charles Carter, Assistant Secretary of Technology Services, State of North Carolina Department of Health and Human Services

Red Hat collaborates across a vast ecosystem of solution providers, systems integrators, software vendors and more to help our customers successfully modernize and transform IT environments to meet their unique business needs. By integrating Red Hat OpenShift with KPMG’s KRIS platform, North Carolina Department of Health and Human Services can more easily bring together cloud-based, legacy and on-premises systems to streamline operations and provide better user experiences for care providers and patients. We look forward to continuing our collaboration with KPMG in the public sector and beyond.

Chris Gray, Vice President, North America Partner Ecosystem, Red Hat

About KPMG LLP

KPMG LLP is the U.S. firm of the KPMG global organization of independent professional services firms providing audit, tax and advisory services. The KPMG global organization operates in 144 countries and territories and has more than 236,000 people working in member firms around the world. Each KPMG firm is a legally distinct and separate entity and describes itself as such. KPMG International Limited is a private English company limited by guarantee. KPMG International Limited and its related entities do not provide services to clients.

KPMG is widely recognized for being a great place to work and build a career. Our people share a sense of purpose in the work we do, and a strong commitment to community service, inclusion and diversity, and eradicating childhood illiteracy. Learn more at www.kpmg.com/us.

Media Contact

For media inquiries, contact Christine Curtin.

Instana & Turbonomic Solution Overview

Applications power business. When they run well, customers get great experiences and IT and development teams remain focused on top initiatives. By combining Instana and Turbonomic from IBM, you get the high level of observability and application resource management capabilities you need to achieve these goals. Download this solution brief for an overview of the benefits of Instana and Turbonomic together.

Applications power businesses. When they run well, your customers have a great experience, and your development and infrastructure teams remain focused on their top initiatives. In today’s world, applications are becoming more distributed and dynamic as enterprises embrace new development methodologies and microservices. Simultaneously, applications are increasingly being deployed across complex hybrid and multicloud environments.

It has never been more challenging to assure that applications deliver exceptional customer experiences that drive positive business results and beat the competition. Application architecture and design must be well executed, and the underlying infrastructure must be resourced to support the real-time demands of the application. The combination of Instana and Turbonomic provides higher levels of observability and trusted actions to continuously optimize and assure application performance.

Why Automation in Integration is the Future

Learn how AI-powered automation can transform the integration lifecycle and why it makes sense to deploy it in your own organization.

In this era of digital transformation, organizations are facing a surge in data and processes. Hyperautomation, the concept of automating as much as possible to run without human intervention, enables IT teams to manage the surge without requiring more resources.

But does automation really work with application integration, where you traditionally need skilled specialists to connect your infrastructure, operations, and data processes?

AI-powered automation brings an innovative approach to integration, increasing the speed and lowering the costs of integration projects. Though there are many reasons to automate your integrations, we’ll cover the top four.

What is automation in integration?

In integration, automation refers to the use of artificial intelligence (AI), repeatable formats and low-cost tooling to connect disparate systems and applications from multiple solution providers. The latest automation solutions use operational data to get deeper insights and continuously improve the quality of integrations.

The value of automated integration

Automation is everywhere, with investments on the rise. In a 2020 Deloitte survey of executives worldwide, 73% of respondents said their organizations had embarked on a path to intelligent automation (up from 58% in 2019).

By automating tasks, organizations can develop end-to-end business processes that are more efficient, reliable and scalable. In fact, Gartner expects that by 2023, organizations will be able to run 25% more tasks autonomously.

Four reasons to use automation in integration

Reason 1: Accelerate integration development

Traditional integrations are often time-consuming and costly, requiring systems integrators to make the connections between heterogeneous platforms. Manufacturing systems, project management applications, and customer support portals all need to communicate in real time. But linking them with point-to-point, custom-coded connections can be a maintenance headache. Change or upgrade one system, and you risk breaking all the links.

Automation enables extended teams to create integrations faster. Low-code/no-code integration tooling can leverage built-in natural language processing (NLP) and AI to offer smarter mapping outcomes. Robotic process automation (RPA) helps simplify integrations with legacy apps. The latest tools also come with a shareable asset repository to enable easy reuse of assets, which accelerates integrations.

Reason 2: Boost integration quality

Many vendors are all about the speed of automation systems, but speed isn’t everything. An API built too quickly or without proper testing can cause significant rework, costing time and money and impacting application performance and, ultimately, reputation.

Automation improves the quality of integrations. Robust automation tools apply AI to real-world operational data to get continuous feedback for optimizations, specific to your organization. The embedded AI can provide workflow and field-mapping recommendations, create smarter API test cases and help uncover inefficiencies in your current environment.

Reason 3: Increase efficiency and reduce costs

In our rapidly changing world, flexibility is essential. Organizations have a mix of legacy systems, cloud-native apps and everything in between. A one-size-fits-all style of integration doesn’t fit the realities of today’s environments (or today’s budget constraints).

The latest automation tools are ready for the hybrid IT world, with multiple stakeholders and styles of integration. Key capabilities range from creating app-integration flows and exposing the work for reuse via an API to having continuous backend availability for updates using business-critical messaging. This helps you avoid the multiple licensing fees and complexities of other approaches.

Reason 4: Ensure security, governance and availability

Integration projects can expose organizations to business and security risks. The more human intervention required, the more chance for human errors by end users. Cloud integration also expands the need for robust security and compliance support.

Automated integration enables organizations to update systems of record with integrity and at scale. The latest tools include protection for data at rest and in motion, which is often a regulatory requirement. Resiliency features and auto-scaling functionality help ensure that backend systems can manage workloads without costly and disruptive changes. Organizations can also identify deployment, operations and security issues as they happen, providing data to feed AI for future best practices and asset protection.

Automation, integration and IBM

IBM cloud integration solutions are built on top of powerful automation services so you can rapidly connect applications and share data across an entire ecosystem. Our AI-accelerated approach enables extended teams to meet escalating demand while helping to reduce costs and increase operational agility.

Find out more about modernizing your integration projects with IBM Cloud Pak® for Integration, which includes everything you need for API management, application and data integration, messaging and events, high-speed data transfers and end-to-end security. Then, to get an expert perspective on your current integration strategy, take our integration assessment.

Enable Unified Analytics

HPE GreenLake edge-to-cloud platform rolls out industry’s first cloud-native unified analytics and data lakehouse cloud services optimized for hybrid environments

IN THIS ARTICLE

  • First cloud-native solution to bring Kubernetes-based Apache Spark analytics and the simplicity of unified data lakehouses using Delta Lake on-premises 
  • Only data fabric to combine S3-native object store, files, streams and databases in one scalable data platform
  • Cloud-native unified analytics platform enables customers to modernize legacy data lakes and warehouses without complex data migration, application rewrites or lock-in
  • 37 solution partners support HPE Ezmeral with 15 joining the HPE Ezmeral Partner Program in the past 60 days

Built on HPE Ezmeral software, the new services give analytics and data science teams frictionless access to data from edge to cloud and a unified platform for accelerated Apache Spark and SQL.

In the Age of Insight, data has become the heart of every digital transformation initiative in every industry, and data analytics has become critical to building successful enterprises. Simply put, data drives competitive advantage. However, for most organizations, significant challenges remain in successfully executing data-first modernization initiatives. Until now, organizations have been stuck with legacy analytics platforms that were either built for a pre-cloud era and lack cloud-native capabilities, or require complex migrations to public clouds, risking vendor lock-in and high costs and forcing the adoption of new processes. This situation has left the big data and analytics software market[1], which IDC forecasts will reach $110 billion by 2023, ripe for disruption.

Today, I am excited to announce two disruptive HPE GreenLake cloud services that will enable customers to overcome these trade-offs. There are four big value propositions we optimized for:

1. Seamless experience for a variety of analytics, SQL, and data science users
2. Top-notch performance
3. Choice and open ecosystem by leveraging pure open source in a hybrid environment
4. An intense focus on reducing TCO by up to 35% for many of the workloads we are targeting

Built from the ground up to be open and cloud-native, our new HPE GreenLake for analytics cloud services will help enterprises unify, modernize, and analyze all of their data, from edge-to-cloud, in any and every place it’s stored. Now analytics and data science teams can leverage the industry’s first cloud-native solution on-premises, scale up Apache Spark lakehouses, and speed up AI and ML workflows. Today’s news is part of a significant set of new cloud services for the HPE GreenLake edge-to-cloud platform, announced today in a virtual launch event from HPE. The new HPE GreenLake for analytics cloud services include the following:

HPE Ezmeral Unified Analytics

HPE now offers an alternative for customers previously limited to solutions in a hyperscale environment by delivering modern analytics on-premises, enabling up to 35%[2] greater cost efficiency than the public cloud for the data-intensive, long-running jobs typical of mission-critical environments. Available on the HPE GreenLake edge-to-cloud platform, HPE Ezmeral Unified Analytics is the industry’s first unified, modern, hybrid analytics and data lakehouse platform.

We believe it is the first solution to architecturally optimize for and leverage three key advancements simultaneously, something no one else in the industry has done:

1. Optimize for a Kubernetes-based Spark environment for on-premises deployment, providing the cloud-native elasticity and agility customers want
2. Handle the diversity of data types (files, tables, streams, and objects) in one consistent platform to avoid silos and make data engineering easier
3. Embrace the edge by enabling a data platform environment that can span from edge to hybrid cloud

Instead of requiring all of your data to live in a public cloud, HPE Ezmeral Unified Analytics is optimized for on-premises and hybrid deployments, and uses open source software to ensure as-needed data portability. We designed our solution with the flexibility and scale to accommodate enterprises’ large data sets, or lakehouses, so customers have the elasticity they need for advanced analytics, everywhere.
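
The building blocks named here, Apache Spark on Kubernetes and Delta Lake, are open source, so the pattern can be sketched with standard PySpark even though HPE’s internal wiring is not published. In the sketch below, the Kubernetes API endpoint, container image, and storage paths are hypothetical placeholders, and the session assumes the Delta Lake and S3 connector jars are available to Spark; treat it as an illustration of the approach, not an HPE-documented configuration.

```python
# Illustrative sketch: a Spark session targeting Kubernetes with Delta Lake
# enabled, writing one table to an S3-compatible path.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("lakehouse-demo")
    .master("k8s://https://kube-apiserver.example.internal:6443")  # hypothetical cluster
    .config("spark.kubernetes.container.image", "example/spark:3.2-delta")  # hypothetical image
    .config("spark.executor.instances", "4")
    # Delta Lake (open source) provides the "lakehouse" table layer on object storage.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Write a small table in Delta format; the same path can then serve both
# batch and streaming readers.
events = spark.createDataFrame(
    [("sensor-1", 21.5), ("sensor-2", 19.8)], ["device", "temperature"]
)
events.write.format("delta").mode("overwrite").save("s3a://lakehouse/bronze/events")

spark.read.format("delta").load("s3a://lakehouse/bronze/events").show()
```

Because the table format and APIs are open source, the same code can target an on-premises object store or a public cloud bucket, which is the portability argument made above.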

Just a few key advantages of HPE Ezmeral Unified Analytics include:

  • Dramatic performance acceleration: Together, the NVIDIA RAPIDS Accelerator for Apache Spark and HPE Ezmeral can accelerate Spark data prep, model training, and visualization by up to 29x[3], allowing data scientists and engineers to build, develop, and deploy analytics solutions into production at scale, faster (a configuration sketch follows this list).
  • Next-generation architecture: We have built on Kubernetes and added value through an orchestration plane to make it easy to get the scale-out elasticity customers want. Our multi-tenant Kubernetes environment supports a compute-storage separation cloud model, providing the combined performance and elasticity required for advanced analytics, while enabling users to create unified real-time and batch analytics lakehouses with Delta Lake integration.
  • Optimized for data analytics: Enterprises can create a unified data repository for use by data scientists, developers, and analysts, including usage and sharing controls, creating the foundation for a silo-free digital transformation that scales with the business as it grows and reaches new data sources. Support for NVIDIA Multi-Instance GPU technology enables enterprises to support a variety of workload requirements and maximize efficiency with up to seven instances per GPU.
  • Enhanced collaboration: Integrated workflows from analytics to ML/AI span hybrid clouds and edge locations, including native open-source integrations with Airflow, ML Flow, and Kubeflow technologies to help data science, data engineering, and data analytics teams collaborate and deploy models faster.
  • Choice and no vendor lock-in: On-premises Apache Spark workloads offer the freedom to choose the deployment environments, tools, and partners needed to innovate faster.
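
As a rough illustration of the RAPIDS integration called out in the first bullet above, the configuration sketch below shows how the open source NVIDIA RAPIDS Accelerator plugin is typically enabled for a GPU-equipped Spark executor. The property values are hypothetical defaults, the rapids-4-spark jar must be on the classpath, and the tuning for an HPE Ezmeral deployment would come from HPE’s and NVIDIA’s documentation.

```python
# Illustrative GPU-acceleration settings for Spark (hypothetical values).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("gpu-accelerated-etl")
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")   # RAPIDS SQL plugin
    .config("spark.rapids.sql.enabled", "true")
    .config("spark.executor.resource.gpu.amount", "1")       # one GPU per executor
    .config("spark.task.resource.gpu.amount", "0.25")        # four concurrent tasks per GPU
    .getOrCreate()
)

# Supported DataFrame/SQL operators run on the GPU; unsupported ones fall back
# to the CPU transparently.
df = spark.range(0, 10_000_000).selectExpr("id % 100 AS bucket", "id")
df.groupBy("bucket").count().show(5)
```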

“Today’s news provides the market with more choice in deploying their modern analytics initiatives with a hybrid-native solution, enabling faster access to data, edge to cloud,” said Carl Olofson, Research Vice President, IDC. “HPE Ezmeral is advancing the data analytics market with continued innovations that fill a gap in the market for an on-premises unified analytics platform, helping enterprises unlock insights to outperform the competition.”

HPE Ezmeral Data Fabric Object Store

Our second disruptive new solution is the HPE Ezmeral Data Fabric Object Store: the industry’s first Data Fabric to combine S3-native object store, files, streams and databases in one scalable data platform that spans edge-to-cloud. Available on bare metal and Kubernetes-native deployments, HPE Ezmeral Data Fabric Object Store provides a global view of an enterprise’s dispersed data assets and unified access to all data within a cloud-native model, securely accessible to the most demanding data engineering, data analytics, and data science applications. Designed with native S3 API, and optimized for advanced analytics, HPE Ezmeral Data Fabric Object Store enables customers to orchestrate both apps and data in a single control plane, while delivering the best price for outstanding performance.

We are proud of the innovation that has resulted in what we believe is an industry first: A consistent data platform which is able to handle a diversity of data types, is optimized for analytics, and is able to span from edge to cloud.
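
To make the “native S3 API” point concrete, the sketch below shows what application access could look like: ordinary S3 SDK calls (boto3 here) pointed at an S3-compatible endpoint. The endpoint URL, bucket name, and credentials are hypothetical placeholders rather than HPE-documented values.

```python
# Illustrative S3-compatible access (all connection details are placeholders).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal:9000",  # hypothetical endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

s3.create_bucket(Bucket="telemetry")
s3.put_object(
    Bucket="telemetry",
    Key="edge-site-01/readings.json",
    Body=b'{"device": "sensor-1", "temperature": 21.5}',
)

# Analytics engines that speak S3 (for example Spark via s3a:// paths) can read
# the same objects without a separate copy or migration step.
for obj in s3.list_objects_v2(Bucket="telemetry").get("Contents", []):
    print(obj["Key"], obj["Size"])
```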

Several key features include:

  • Optimized performance for analytics: Designed for scalable object stores, HPE Ezmeral Data Fabric Object Store is the industry’s only solution that supports file, stream, database, and now object data types within a common persistent store, optimized for best performance across edge-to-cloud analytics workloads.
  • Globally synchronized edge-to-cloud data: Clusters and data are orchestrated together to support dispersed edge operations, and a single Global Namespace provides simplified access to edge-to-cloud topologies from any application or interface. While data can be mirrored, snapshotted, and replicated, advanced security and policies ensure the right people and applications have access to the right data when they need it.
  • Continuous scaling: Enterprises can grow as needed by adding nodes and configuring policies for data persistence while the data store handles the rest. 
  • Performance and cost balance: Adapting to small or large objects, auto-tiering policies automatically move data from high-performance storage to low-cost storage.  

Expanding the HPE Ezmeral Partner Ecosystem

We first introduced the HPE Ezmeral Partner Program in March 2021, enabling the rapid creation of streamlined, customized analytics engines and environments based on full-stack solutions validated by trusted ISV partners. With 76% of enterprises expecting to be using on-premises, third-party-managed private cloud infrastructure for data and analytics workloads within the next year[4], we’re excited to announce six new ISV partners today: NVIDIA NGC, Pepperdata, Confluent, Weka, Ahana, and gopaddle.

“NVIDIA’s contributions to Apache Spark enable enterprises to process data orders of magnitude faster while significantly lowering infrastructure costs,” said Manuvir Das, head of Enterprise Computing, NVIDIA. “Integrating the NVIDIA RAPIDS Accelerator for Apache Spark and NVIDIA Triton Inference Server into the HPE Ezmeral Unified Analytics Platform streamlines the development and deployment of high-performance analytics, helping customers gain immediate results at lower costs.” 

“Today, companies are using Spark to build their high-performance data applications, accelerating tens to thousands of terabytes of data transitioning from data lakes to AI data modeling,” said Joel Stewart, Vice President Customer Success, Pepperdata. “Pepperdata on HPE Ezmeral Runtime Enterprise can help reduce operating costs and provide deep insights into their Spark applications to improve performance and reliability.”

Since the HPE Ezmeral Partner Program launched, we’ve added 37 solution partners[5] to support our customers’ core use cases and workloads, including big data and AI/ML use cases. The Partner Program is also adding support today for open-source projects such as Apache Spark, offering enterprises the ability to transition workloads to a modern, cloud-native architecture.

HPE GreenLake edge-to-cloud platform and HPE Ezmeral are transforming enterprises – and HPE

As an important component of HPE GreenLake cloud services, the HPE Ezmeral software portfolio helps enterprises such as GM Financial and Bidtellect advance modern data analytics initiatives. Since it was first introduced in June 2020, HPE Ezmeral has secured dozens of new customers, with significant competitive wins over both traditional big data players and public cloud vendors.

Since vast volumes of applications and data will remain on-premises and at the edge as enterprises continue their digital transformations, our elastic, unified analytics solutions will help customers extract maximum value from their data, wherever it lives and moves, from edge to cloud. We look forward to working with you to make the most of your data as the Age of Insight continues to reshape enterprises around the world.

Availability and Additional Resources

HPE Ezmeral Unified Analytics and HPE Ezmeral Data Fabric Object Store will be available as HPE GreenLake cloud services beginning November 2021 and Q1 2022, respectively.

Learn more about today’s news from the experts. Join these deep dive sessions as I chat with:

HPE and the HPE logo are trademarks or registered trademarks of HPE and/or its affiliates in the U.S. and other countries.  Third-party trademarks mentioned are the property of their respective owners. 

[1] IDC, Worldwide Big Data and Analytics Software Forecast, 2021–2025, July 2021

[2] Based on internal HPE competitive analysis, September 2021

[3] Technical Paper: HPE Ezmeral for Apache Spark with NVIDIA GPU, published September 2021

[4] 451 Research, Voice of the Enterprise: Data & Analytics, Data Platforms 2021

[5] Internal HPE documentation on list of partners maintained by the group