Hewlett Packard Enterprise (HPE) is overhauling its Ezmeral software portfolio, simplifying data organization, management and analysis for organizations advancing their artificial intelligence (AI) and machine learning (ML) workflows.
HPE Ezmeral, the enterprise software division of HPE, has grown in recent years through a series of acquisitions. In 2018, HPE acquired BlueData, which brought capabilities for using application containers to manage data in support of AI/ML efforts. In 2019, it acquired data platform provider MapR, which added functionality built around the Hadoop Distributed File System (HDFS). In 2021, HPE acquired Ampool, adding a distributed SQL engine for data queries. And on May 2, HPE brought on most of the leadership and engineering team behind Arrikto, a major contributor to the open source Kubeflow ML workflow project. HPE Ezmeral also supports many other open source projects that are critical to modern AI and data operations, including Apache Spark, Apache Airflow, Apache Superset, MLflow, Feast and Ray.
HPE announced today that it is reorganizing and refreshing the Ezmeral portfolio around two main streams. The new HPE Ezmeral Data Fabric software suite consolidates the company’s data management technologies, while HPE Ezmeral Unified Analytics Software brings together the tools and technologies organizations need to run data analytics and AI/ML workloads.
“We are introducing the new streamlined portfolio where we have chosen to move away from essentially building custom solutions for container platforms, MLOps and Spark,” Mohan Rajagopalan, vice president and general manager of HPE Ezmeral Software, told VentureBeat. “We want to focus on providing core functionality that helps our customers develop and deploy applications in a hybrid multicloud world.”
Why HPE is simplifying the Ezmeral software portfolio
Rajagopalan explained that past acquisitions had largely meant continuing to operate the acquired companies’ technologies under the HPE umbrella.
“Today, what we’re doing is basically taking a step back and looking at how these various technologies interact, and more importantly, spending time with our customers to figure out where we should be playing,” Rajagopalan said.
What is now obvious to HPE and many others is that data is critical to the success of modern businesses. Data helps drive decision making with analytics and business intelligence. Data is also the foundation for AI/ML.
“Almost all customers say data is the future, and whoever has more data has better insights,” he said. “Insight generation technology will continue to evolve over time.”
Having data means having the right tools to manage it and make sense of it, as well as the infrastructure that supports those data initiatives. To that end, Rajagopalan noted that HPE Ezmeral complements the HPE GreenLake portfolio: Ezmeral focuses solely on the software layers of the stack, while GreenLake takes a more infrastructure-oriented view.
Why data fabric matters

Data in a modern enterprise is often found in many different locations and applications. With HPE Ezmeral Data Fabric software, the goal is to provide a suite of capabilities to help businesses organize, manage and connect data.
Rajagopalan explained that with Data Fabric, HPE is providing capabilities for data found in files, streaming data sources, and databases. The basic idea is to enable a federated data layer, where different sources can be connected in an approach that allows organizations to take advantage of all their data.
Functionally, Data Fabric enables a hybrid data lakehouse, where data stored on-premises, at the edge or in the cloud is available for analysis.
“We create data lakehouses where customers can simply pump data in a variety of formats into the fabric. It can be different types of data and it can be in different places,” Rajagopalan said. “Data Fabric just makes the magic happen where the data appears where it needs to be consumed.”
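To make the federated, "data anywhere" idea concrete, here is a minimal sketch of how an analyst might query such a lakehouse through Apache Spark, one of the open source projects HPE names in its portfolio. The file paths, bucket name and column names are hypothetical, and the sketch assumes edge and cloud datasets are exposed to a configured Spark session as ordinary storage paths; it is illustrative, not HPE's actual API.

from pyspark.sql import SparkSession

# Hypothetical example: one Spark session reads data that physically lives
# in different places (an edge mount and a cloud object store) and joins it
# as if it were a single lakehouse. Paths, bucket and columns are made up.
spark = SparkSession.builder.appName("fabric-lakehouse-sketch").getOrCreate()

edge_readings = spark.read.parquet("/mnt/fabric/edge/sensor_readings")  # edge/on-prem data
cloud_events = spark.read.parquet("s3a://example-bucket/clickstream/")  # cloud object store

# Federated view: join the two sources and run an ordinary aggregation.
combined = edge_readings.join(cloud_events, on="device_id", how="inner")
combined.groupBy("region").count().show()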
Unified analytics is powered by open source tools
Data alone isn't useful without the tools to analyze it. This is where the HPE Ezmeral Unified Analytics software suite comes into play.
“What we’ve done is basically take the best open source technologies, so think Superset, Kubeflow, Airflow, Feast and Ray, and we package them with enterprise-grade guardrails under this umbrella called Unified Analytics,” Rajagopalan said.
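As an illustration of how those packaged open source pieces typically fit together, the sketch below defines a small Apache Airflow DAG that could schedule a daily feature refresh ahead of a model-training step. The DAG name, task names and function bodies are hypothetical placeholders; this is a generic Airflow 2.x pattern, not HPE's Unified Analytics configuration.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical daily pipeline: refresh feature data, then kick off training.
# Task names and logic are placeholders for illustration only.
def refresh_features():
    print("Recompute feature tables (e.g., push to a Feast feature store).")

def train_model():
    print("Launch a training job (e.g., on a Ray cluster).")

with DAG(
    dag_id="daily_feature_and_training_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    refresh = PythonOperator(task_id="refresh_features", python_callable=refresh_features)
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    refresh >> train  # run the feature refresh before training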
With all the hype and excitement around large language models (LLMs) and the overwhelming success of OpenAI’s ChatGPT, many organizations are looking to take advantage of AI. While the largest LLMs require massive amounts of data, Rajagopalan stressed that there is considerable value in the data companies already own that could potentially be used for AI modeling.
“We want to give them all the tools so they can start building their own domain-specific models. They could be LLM-centric, they could be chat-centric, they could be workflow-centric,” Rajagopalan said. “Our thesis here is that enterprises are sitting on treasure troves of data.”