Happy Holidays and Happy New Year from Elite Paradigm LLC

During this season, we find ourselves reflecting on the past year and the customers who’ve helped shape our success. In this spirit, the team at ELITE PARADIGM wishes you and yours a happy holiday season!

Get connected with Elite Paradigm LLC for your 2024 IT and cybersecurity needs.

Helping a Healthcare Leader Recover from a Malware Attack

A major healthcare organization was struggling in the wake of a malware attack and needed to improve its cybersecurity posture. Read the case study to learn how Unit 42 identified and managed the risks.

Download

https://4719eaee91034be722d8-c86a406a93c55de2464febd03debd4f0.ssl.cf1.rackcdn.com/Helping_a_healthcare_leader_recover_from_a_malware_attack_case_study_.pdf
Sanofi gains speed and agility with an Azure hybrid cloud strategy

Since its founding in 1973, Sanofi has relentlessly pushed the boundaries of medicine. With a mission to deliver life-saving medical treatments and vaccines to people everywhere, Sanofi relies on its cutting-edge science and manufacturing to make the impossible possible for millions of people around the world. When faced with the limitations of its on-premises infrastructure, Sanofi adopted a hybrid cloud strategy and chose Microsoft Azure as its cloud platform, gaining the speed, agility, and reliability necessary for innovation.

Azure gives Sanofi the ability to easily scale workloads up and down as needed in dynamic ways. 

Sam Chenaur: Vice President and Global Head of Infrastructure

Sanofi

https://play.vidyard.com/Axhm1iXTYC5y7kV4WTKBc7?disable_popouts=1&type=inline&v=4.3.14

A need for speed

As a global pharmaceutical company, Sanofi must follow stringent compliance, cybersecurity, and quality control requirements. To uphold those requirements, Sanofi has kept data and applications on-premises.

But on-premises infrastructure is costly, cumbersome, and can suffer from slow performance. Outages are another potential disadvantage of physical infrastructure, slowing the ability to launch products in a timely fashion. For instance, if a mission-critical app went down, time and resources would be spent fixing the problem rather than on research and development.

To deliver on its mission of transforming the practice of medicine, Sanofi needed a way to work faster and more efficiently while maintaining compliance requirements. The answer? A hybrid cloud strategy, which would allow Sanofi to benefit from cloud-specific advantages like speed, agility, and scale while retaining on-premises control.  

After reviewing several options, Sanofi chose Microsoft Azure as its strategic cloud platform. With 70 percent of Sanofi’s network already on Microsoft Windows Servers, this was an easy choice. Migrating to Azure would help Sanofi reduce costs and give it the option to explore open-source technology approaches with Linux support.

From months to hours

Undergoing a large-scale migration is no small feat. To help ensure a smooth migration, Sanofi worked with Mobiz, a Microsoft Partner Network member specializing in cloud infrastructure and network automation, digital transformation, and data protection. Sanofi and Mobiz settled on a digital transformation plan that includes migrating more than 15,000 servers and 1,800 apps over the next few years and chose Azure Migrate and Azure VMware Solution to move the workloads.

Despite being early in its migration journey, Sanofi is already seeing major benefits. Thanks to the global presence of Azure, Sanofi is now able to set up new regions in record time at a fraction of the cost. “With help from Microsoft and Mobiz, we were able to deliver a fully qualified landing zone in Azure in one-third the time and at one-third the budget compared to previous cloud efforts. For example, Sanofi can deploy a new fully qualified landing zone in a matter of hours, when previously it would take us six months or more to open a new site or country,” says Sam Chenaur, Vice President and Global Head of Infrastructure at Sanofi. 

Sanofi is also experiencing fewer outages and faster recovery times, improving its mean time to recovery (MTTR) by 300 to 500 percent. Overall, these improvements allow Sanofi to manufacture and deliver high-quality medicine without interruption.

Automate everything

So why has Sanofi’s migration been successful thus far? Automation. 

As Hamad Riaz, Chief Executive Officer at Mobiz, puts it: “Our goal at Sanofi is to automate everything. So, when we say ‘100-percent automation’ we mean building automation around infrastructure and building automation engines.” 

Using the Microsoft Cloud Adoption Framework, Sanofi can automatically deploy new infrastructure regions using baseline templates with all necessary requirements automatically added on. “With Azure, we can tie everything together in a single pane of glass: business requirements, compliance, and infrastructure—everything,” says Riaz. “Azure is the most tightly integrated product on the market today.”
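
The template-driven deployment described above can be sketched in a few lines. The sketch below is illustrative only: the baseline keys, settings, and region names are hypothetical examples, not Sanofi's or Microsoft's actual landing-zone schema.

```python
# Illustrative sketch of template-driven region deployment: a baseline
# landing-zone template is merged with per-region overrides, so every new
# region automatically inherits the required compliance settings.
# All keys and values here are hypothetical examples.

import copy

BASELINE = {
    "network": {"address_space": "10.0.0.0/16", "firewall": True},
    "compliance": {"encryption_at_rest": True, "audit_logging": True},
    "tags": {"managed_by": "platform-team"},
}

def build_landing_zone(region, overrides=None):
    """Return a deployable spec: baseline plus region plus per-section overrides."""
    spec = copy.deepcopy(BASELINE)  # never mutate the shared baseline
    spec["region"] = region
    for section, values in (overrides or {}).items():
        spec.setdefault(section, {}).update(values)
    return spec

zone = build_landing_zone("westeurope", {"tags": {"cost_center": "r-and-d"}})
print(zone["region"], zone["compliance"]["audit_logging"], zone["tags"]["cost_center"])
```

Because every zone starts from the same baseline, compliance requirements are enforced by construction rather than checked after the fact, which is what makes hours-not-months deployment plausible.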

From speeding up app development through templates to reducing errors, Sanofi is using Azure automation to work faster and more efficiently. Developers are empowered to build new apps—quickly—using templates, as well as tear down and rebuild outage-prone applications with minimal effort. “Azure gives Sanofi the ability to easily scale workloads up and down as needed in dynamic ways,” says Chenaur. 

With the Azure hybrid cloud offering, Sanofi employees can work faster and more efficiently, which means a faster time-to-market for bringing life-saving medicines to patients. “Digital transformation at Sanofi is driving a more agile business and a faster time to market to manufacture and deliver medicine,” says Chenaur. “The future with Microsoft is incredibly exciting and the possibilities are endless.”

With help from Microsoft and Mobiz, we were able to deliver a fully qualified landing zone in Azure in one-third the time and at one-third the budget compared to previous cloud efforts. 

Sam Chenaur: Vice President and Global Head of Infrastructure

Sanofi

Instana & Turbonomic Solution Overview

Applications power business. When they run well, customers get great experiences and IT and development teams remain focused on top initiatives. By combining Instana and Turbonomic from IBM, you get the high level of observability and application resource management capabilities you need to achieve these goals. Download this solution brief for an overview of the benefits of Instana and Turbonomic together.

Applications power businesses. When they run well, your customers have a great experience, and your development and infrastructure teams remain focused on their top initiatives. In today’s world, applications are becoming more distributed and dynamic as enterprises embrace new development methodologies and microservices. Simultaneously, applications are increasingly being deployed across complex hybrid and multicloud environments.

It has never been more challenging to ensure that applications deliver exceptional customer experiences that drive positive business results and beat the competition. Application architecture and design must be well executed, and the underlying infrastructure must be resourced to support the real-time demands of the application. The combination of Instana and Turbonomic provides higher levels of observability and trusted actions to continuously optimize and assure application performance.

Contact Elite Paradigm

Enable Unified Analytics

HPE GreenLake edge-to-cloud platform rolls out industry’s first cloud-native unified analytics and data lakehouse cloud services optimized for hybrid environments

IN THIS ARTICLE

  • First cloud-native solution to bring Kubernetes-based Apache Spark analytics and the simplicity of unified data lakehouses using Delta Lake on-premises 
  • Only data fabric to combine S3-native object store, files, streams and databases in one scalable data platform
  • Cloud-native unified analytics platform enables customers to modernize legacy data lakes and warehouses without complex data migration, application rewrites or lock-in
  • 37 solution partners support HPE Ezmeral with 15 joining the HPE Ezmeral Partner Program in the past 60 days

Built on HPE Ezmeral software, the new services give analytics and data science teams frictionless access to data from edge to cloud and a unified platform for accelerated Apache Spark and SQL.

In the Age of Insight, data has become the heart of every digital transformation initiative in every industry, and data analytics has become critical to building successful enterprises. Simply put, data drives competitive advantage. However, significant challenges remain for most organizations seeking to execute data-first modernization initiatives. Until now, organizations have been stuck with legacy analytics platforms that were either built for a pre-cloud era and lack cloud-native capabilities, or require complex migrations to public clouds, risking vendor lock-in and high costs and forcing the adoption of new processes. This situation has left the big data and analytics software market1, which IDC forecasts will reach $110 billion by 2023, ripe for disruption.

Today, I am excited to announce two disruptive HPE GreenLake cloud services that will enable customers to overcome these trade-offs. There are four big value propositions we optimized for:

1. A seamless experience for a variety of analytics, SQL, and data science users

2. Top-notch performance

3. Choice and an open ecosystem by leveraging pure open source in a hybrid environment

4. An intense focus on reducing TCO by up to 35% for many of the workloads we are targeting

Built from the ground up to be open and cloud-native, our new HPE GreenLake for analytics cloud services will help enterprises unify, modernize, and analyze all of their data, from edge-to-cloud, in any and every place it’s stored. Now analytics and data science teams can leverage the industry’s first cloud-native solution on-premises, scale up Apache Spark lakehouses, and speed up AI and ML workflows. Today’s news is part of a significant set of new cloud services for the HPE GreenLake edge-to-cloud platform, announced today in a virtual launch event from HPE. The new HPE GreenLake for analytics cloud services include the following:

HPE Ezmeral Unified Analytics

HPE now offers an alternative to customers previously limited to solutions in a hyperscale environment by delivering modern analytics on-premises, enabling up to 35%2 more cost efficiencies than the public cloud for data-intensive, long running jobs typical in mission critical environments. Available on the HPE GreenLake edge-to-cloud platform, HPE Ezmeral Unified Analytics is the industry’s first unified, modern, hybrid analytics and data lakehouse platform.

We believe it is the first solution to architecturally optimize and leverage three key advancements simultaneously, which no one else in the industry has done:

1. Optimized for a Kubernetes-based Spark environment for on-premises deployment, providing the cloud-native elasticity and agility customers want

2. Handles the diversity of data types, from files, tables, and streams to objects, in one consistent platform to avoid silos and make data engineering easier

3. Embraces the edge by enabling a data platform environment that can span from edge to hybrid cloud

Instead of requiring all of your data to live in a public cloud, HPE Ezmeral Unified Analytics is optimized for on-premises and hybrid deployments, and uses open source software to ensure as-needed data portability. We designed our solution with the flexibility and scale to accommodate enterprises’ large data sets, or lakehouses, so customers have the elasticity they need for advanced analytics, everywhere.

Just a few key advantages of HPE Ezmeral Unified Analytics include:

  • Dramatic performance acceleration: Together, the NVIDIA RAPIDS Accelerator for Apache Spark and HPE Ezmeral can accelerate Spark data prep, model training, and visualization by up to 29x3, allowing data scientists and engineers to build, develop, and deploy analytics solutions into production at scale, faster.
  • Next-generation architecture: We have built on Kubernetes and added value through an orchestration plane to make it easy to get the scale-out elasticity customers want. Our multi-tenant Kubernetes environment supports a compute-storage separation cloud model, providing the combined performance and elasticity required for advanced analytics, while enabling users to create unified real-time and batch analytics lakehouses with Delta Lake integration.
  • Optimized for data analytics: Enterprises can create a unified data repository for use by data scientists, developers, and analysts, including usage and sharing controls, creating the foundation for a silo-free digital transformation that scales with the business as it grows and reaches new data sources. Support for NVIDIA Multi-Instance GPU technology enables enterprises to support a variety of workload requirements and maximize efficiency with up to seven instances per GPU.
  • Enhanced collaboration: Integrated workflows from analytics to ML/AI span hybrid clouds and edge locations, including native open-source integrations with Airflow, ML Flow, and Kubeflow technologies to help data science, data engineering, and data analytics teams collaborate and deploy models faster.
  • Choice and no vendor lock-in: On-premises Apache Spark workloads offer the freedom to choose the deployment environments, tools, and partners needed to innovate faster.
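
As a rough illustration of how the RAPIDS acceleration mentioned above is typically wired into a Kubernetes-backed Spark deployment, the sketch below renders spark-submit configuration flags that enable the NVIDIA RAPIDS SQL plugin. The plugin class name is the publicly documented one; the API server endpoint, container image, and resource values are placeholder assumptions, not an HPE-specific configuration.

```python
# Hedged sketch: render spark-submit flags for a Kubernetes-backed Spark job
# with the NVIDIA RAPIDS SQL plugin enabled. Endpoint and image values below
# are placeholders for illustration only.

def spark_submit_flags(conf):
    """Turn a config dict into repeated --conf key=value flags, sorted by key."""
    return [f"--conf {k}={v}" for k, v in sorted(conf.items())]

conf = {
    "spark.master": "k8s://https://k8s-api.example.internal:6443",      # hypothetical endpoint
    "spark.plugins": "com.nvidia.spark.SQLPlugin",                      # RAPIDS accelerator plugin
    "spark.executor.resource.gpu.amount": "1",                          # one GPU per executor
    "spark.kubernetes.container.image": "example/spark-rapids:latest",  # placeholder image
}

for flag in spark_submit_flags(conf):
    print(flag)
```

The key idea is that GPU acceleration is switched on through configuration alone, so existing Spark SQL jobs can pick up the speedup without code changes.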

“Today’s news provides the market with more choice in deploying their modern analytics initiatives with a hybrid-native solution, enabling faster access to data, edge to cloud,” said Carl Olofson, Research Vice President, IDC. “HPE Ezmeral is advancing the data analytics market with continued innovations that fill a gap in the market for an on-premises unified analytics platform, helping enterprises unlock insights to outperform the competition.”

HPE Ezmeral Data Fabric Object Store

Our second disruptive new solution is the HPE Ezmeral Data Fabric Object Store: the industry’s first Data Fabric to combine S3-native object store, files, streams and databases in one scalable data platform that spans edge-to-cloud. Available on bare metal and Kubernetes-native deployments, HPE Ezmeral Data Fabric Object Store provides a global view of an enterprise’s dispersed data assets and unified access to all data within a cloud-native model, securely accessible to the most demanding data engineering, data analytics, and data science applications. Designed with native S3 API, and optimized for advanced analytics, HPE Ezmeral Data Fabric Object Store enables customers to orchestrate both apps and data in a single control plane, while delivering the best price for outstanding performance.
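
Because the store exposes a native S3 API, existing S3 tooling can in principle point at an on-premises endpoint instead of a public cloud. The sketch below only shows the path-style addressing convention involved; the endpoint host and bucket are hypothetical, and a real client (for example, an S3 SDK) would also handle authentication.

```python
# Minimal sketch of S3 "path-style" addressing against an on-premises,
# S3-compatible endpoint. The host below is a hypothetical example; real
# requests also need SigV4 authentication, which an S3 SDK would handle.

from urllib.parse import quote

def s3_object_url(endpoint, bucket, key):
    """Build a path-style object URL: https://<endpoint>/<bucket>/<key>."""
    return f"https://{endpoint}/{bucket}/{quote(key)}"  # '/' in keys is preserved

url = s3_object_url("objectstore.corp.example", "sensor-data",
                    "edge/site-7/2021-11-01.parquet")
print(url)  # https://objectstore.corp.example/sensor-data/edge/site-7/2021-11-01.parquet
```

This is what "S3-native" buys in practice: applications written against the S3 API only need a different endpoint to target the on-premises store.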

We are proud of the innovation that has resulted in what we believe is an industry first: A consistent data platform which is able to handle a diversity of data types, is optimized for analytics, and is able to span from edge to cloud.

Several key features include:

  • Optimized performance for analytics: Designed for scalable object stores, HPE Ezmeral Data Fabric Object Store is the industry’s only solution that supports file, stream, database, and now object data types within a common persistent store, optimized for best performance across edge-to-cloud analytics workloads.
  • Globally synchronized edge-to-cloud data: Clusters and data are orchestrated together to support dispersed edge operations, and a single global namespace provides simplified access to edge-to-cloud topologies from any application or interface. While data can be mirrored, snapshotted, and replicated, advanced security and policies ensure the right people and applications have access to the right data, when they need it.
  • Continuous scaling: Enterprises can grow as needed by adding nodes and configuring policies for data persistence while the data store handles the rest. 
  • Performance and cost balance: Adapting to small or large objects, auto-tiering policies automatically move data from high-performance storage to low-cost storage.  
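
The auto-tiering behavior described above can be sketched as a simple age-and-size policy. The thresholds and tier names below are invented for illustration; they are not HPE's actual policy engine.

```python
# Illustrative age/size tiering policy: recently accessed objects stay on
# fast storage, older data moves to lower-cost capacity or archive tiers.
# Thresholds and tier names are hypothetical assumptions.

def choose_tier(days_since_access, size_mb):
    """Pick a storage tier from an object's access age (days) and size (MB)."""
    if days_since_access <= 7:
        return "performance"    # hot data: NVMe/SSD tier
    if days_since_access <= 90:
        return "capacity"       # warm data: HDD tier
    # cold data: archive large objects; tiny ones aren't worth the overhead
    return "archive" if size_mb >= 1 else "capacity"

print(choose_tier(2, 500))    # performance
print(choose_tier(30, 500))   # capacity
print(choose_tier(400, 500))  # archive
```

A real policy engine would evaluate rules like this continuously in the background, moving objects between tiers without application changes.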

Expanding the HPE Ezmeral Partner Ecosystem

We first introduced the HPE Ezmeral Partner Program in March 2021, enabling the rapid creation of streamlined, customized analytics engines and environments based on full stack solutions validated by trusted ISV partners. With 76% of enterprises expecting to be using on-premises, third-party-managed private cloud infrastructure for data and analytics workloads within the next year4, we’re excited to announce six new ISV partners today, including: NVIDIA NGC, Pepperdata, Confluent, Weka, Ahana and gopaddle.

“NVIDIA’s contributions to Apache Spark enable enterprises to process data orders of magnitude faster while significantly lowering infrastructure costs,” said Manuvir Das, head of Enterprise Computing, NVIDIA. “Integrating the NVIDIA RAPIDS Accelerator for Apache Spark and NVIDIA Triton Inference Server into the HPE Ezmeral Unified Analytics Platform streamlines the development and deployment of high-performance analytics, helping customers gain immediate results at lower costs.” 

“Today, companies are using Spark to build their high-performance data applications, accelerating tens to thousands of terabytes of data transitioning from data lakes to AI data modeling,” said Joel Stewart, Vice President Customer Success, Pepperdata. “Pepperdata on HPE Ezmeral Runtime Enterprise can help reduce operating costs and provide deep insights into their Spark applications to improve performance and reliability.”

Since the HPE Ezmeral Partner Program launched, we’ve added 37 solution partners5 to support our customers’ core use cases and workloads, including big data and AI/ML use cases. The Partner Program is also adding support today for open-source projects such as Apache Spark, offering enterprises the ability to transition workloads to a modern, cloud-native architecture.

The HPE GreenLake edge-to-cloud platform and HPE Ezmeral are transforming enterprises, and HPE

As an important component of HPE GreenLake cloud services, the HPE Ezmeral software portfolio helps enterprises such as GM Financial and Bidtellect advance modern data analytics initiatives. Since it was first introduced in June 2020, HPE Ezmeral has secured dozens of new customers, with significant competitive wins over both traditional big data players and public cloud vendors.

Since vast volumes of applications and data will remain on-premises and at the edge as enterprises continue their digital transformations, our elastic, unified analytics solutions will help customers extract maximum value from their data, wherever it lives and moves, from edge to cloud. We look forward to working with you to make the most of your data as the Age of Insight continues to reshape enterprises around the world.

Availability and Additional Resources

HPE Ezmeral Unified Analytics and HPE Ezmeral Data Fabric Object Store will be available as HPE GreenLake cloud services beginning November 2021 and Q1 2022, respectively.

Learn more about today’s news from the experts by joining our deep dive sessions.

HPE and the HPE logo are trademarks or registered trademarks of HPE and/or its affiliates in the U.S. and other countries.  Third-party trademarks mentioned are the property of their respective owners. 

1 IDC, Worldwide Big Data and Analytics Software Forecast, 2021–2025, July 2021

2 Based on internal HPE competitive analysis, September 2021

3 Technical Paper: HPE Ezmeral for Apache Spark with NVIDIA GPU, published September 2021

4 451 Research, Voice of the Enterprise: Data & Analytics, Data Platforms 2021

5 Internal HPE documentation on list of partners maintained by the group

Networks Are Becoming Cloud-centric. Network Security Must Adapt.

Today’s digital journey is long and complex, creating equal parts opportunity and risk for organizations. The pandemic has added complexity to an already complicated world, and the digital landscape has been no exception. Networks have further expanded into the cloud, and organizations have reinvented themselves even while reacting and responding to new circumstances and new cyberthreats. One question is top of mind: Where do we go from here? It’s clear that cybersecurity is no longer simply a defense. In a world that’s moving from cloud-ready to cloud-centric, cybersecurity has become a critical component in the foundation of the enterprise.

The physical world and the digital world have never been more interconnected and interdependent. You’ve no doubt seen the evidence – employees moving out of their offices, sensitive data and workloads leaving the friendly confines of the data center, legacy and SaaS applications needing to peacefully coexist, and every “thing” connecting to the Internet of Things. Network security is evolving to meet these challenges, and it’s critical to have the right cybersecurity strategy and partner.

Limitations of Legacy Approaches in a Cloud-Centric World

Legacy approaches to securing the network and cloud applications are broken due to several critical limitations:

  • Disjointed, complex SaaS security: Current cloud access security broker (CASB) solutions are complex to deploy and maintain, exist separately from the rest of the security infrastructure, and result in a high total cost of ownership (TCO). In addition, they offer subpar security as threats morph and more data and applications reside in a “distributed cloud” spread across thousands of SaaS applications, multiple cloud providers, and on-premises locations.
  • Reactive security: Legacy network security solutions still rely on a signature-based approach that requires security analysts to hunt down zero-day attacks in retrospect, rather than placing machine learning (ML) inline for real-time prevention. Meanwhile, attackers are using automation and the computing power of the cloud to constantly morph threats. Over the last decade, the number of new malware variants has increased from thousands per day to millions per day. In addition, hundreds of thousands of new malicious URLs are created daily, and security based on URL databases must evolve.
  • Lack of holistic identity-based security: The identity of users is no longer confined to on-premises directories. 87% of organizations use or plan to move to a cloud-based directory service to store user identities. Organizations need to configure, maintain and synchronize their network security ecosystem with the multiple identity providers used by an enterprise, which can be time-consuming and resource-intensive. Network security tools don’t apply identity-based security controls consistently, which creates a significant barrier to adopting Zero Trust measures to protect organizations against data breaches. As more people are working from anywhere, they require fast and always-on access to data and applications in the distributed cloud, regardless of location.
  • Trading performance for security: Users are accessing more data-rich applications hosted in the cloud, and the performance of network security devices degrades severely when legacy security services and decryption are enabled. That’s why, too often in the past, organizations have been forced to choose between performance, to deliver a good user experience, and security, to keep data and users safe.

Where Network Security Will Go From Here

Today’s distributed cloud operates at hyperscale – storing vast amounts of data and applications, and using near-infinite nodes to store that data. Traffic, especially web traffic, flowing between users and this distributed cloud is growing tremendously. The latest numbers from Google show that up to 98% of this traffic is being encrypted. In order to offer agility and flexibility, organizations moving toward this distributed cloud model aspire to become “cloud like,” providing on-demand access to resources and applications at hyperscale.

To meet the new challenges, security teams need cloud-centric network security solutions that:

a. See and control all applications, including thousands of SaaS applications that employees access daily – and the many new ones that keep becoming available at an incredible cloud velocity – using a risk-based approach for prioritization that takes into account data protection and compliance.

b. Stop known and unknown threats, including zero-day web attacks, in near real time.

c. Enable access for the right users, irrespective of where user identity data is stored – on-premises, in the cloud or a hybrid of both.

d. Offer comprehensive security, including decryption, without compromising performance, allowing security to keep pace with growing numbers of users, devices and applications.

e. Have integrated, inline and simple security controls that are straightforward to set up and operate.
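
Point (a) above, a risk-based approach to SaaS application control, can be sketched as a scoring rule that weighs an app's risk rating against the sensitivity of the data it touches. The categories, weights, and thresholds below are invented for illustration, not any vendor's actual policy model.

```python
# Hedged sketch of risk-based SaaS application control: combine a per-app
# risk score with data sensitivity to produce an allow/restrict/block verdict.
# All scores, weights, and thresholds below are illustrative assumptions.

def app_verdict(app_risk, data_sensitivity):
    """Both inputs on a 1 (low) to 5 (high) scale; returns a policy verdict."""
    score = 0.6 * app_risk + 0.4 * data_sensitivity  # weight app risk slightly higher
    if score >= 4.0:
        return "block"
    if score >= 2.5:
        return "restrict"   # e.g. allow read-only access, block uploads
    return "allow"

print(app_verdict(1, 1))  # allow
print(app_verdict(3, 3))  # restrict
print(app_verdict(5, 5))  # block
```

Scoring rather than binary allow/deny lets new, unclassified SaaS apps default into a restricted middle tier until they are assessed.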

Palo Alto Networks has a 15-year history of delivering best-in-class security. We’re here to help secure the next steps on the digital journey, wherever they take us. Whether you’re a seasoned traveler or just starting out, we can help our customers find a new approach to network security – one that better matches today’s cloud-centric networks. What’s next for us will be revealed soon. Follow us on LinkedIn to be the first to know about our upcoming events.
