Much has changed for businesses over the past 40 years. In the 1980s, the growth of personal computers led to microcomputers (servers), and in the 1990s, data centers became commonplace. Next came virtualization and, in the early 2000s, the need to address an explosion of data-driven data center growth. When Amazon launched its Elastic Compute Cloud (EC2) in 2006, cloud computing dramatically changed the way companies manage their data – and their operations.

As an IT analyst, Martin Hingley, President and Market Analyst at IT Candor Limited, based in Oxford, UK, has found himself at the forefront of all these changes. In a recent BriefingsDirect podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, discussed some of these changes with Hingley. The two analysts examined how artificial intelligence, orchestration, and automation can help control the complexity of continuous change.

After 30 years of data center evolution, are we any closer to simplicity?

Gardner began the interview by asking Hingley whether a new era of technologies is helping organizations better manage IT complexity. Hingley responded, "I've been an IT analyst for 35 years and it's still the same. Each generation of systems arrives and takes over from the last, which has always left operators with the problem of managing the new alongside the old."

Hingley recalled the shift to the client/server model in the late '80s and early '90s with the influx of PC servers. "At that point, administrators had to manage all these new systems and could not manage them under the same structure. Of course, this problem has continued over time."

Management complexity is particularly difficult for large organizations because they have a huge array of resources. "The cloud did not help," said Hingley. "The cloud is very different from your on-premises hardware: you program it differently, you develop for it differently. It offers a wonderful cost proposition, at least at the beginning. But now, of course, these companies have to face all this complexity. Managing multi-cloud resources (private and public) combined with traditional IT is much more difficult."

Massive amounts of data: putting your house in order with AI

In addition, consumers and businesses are creating huge amounts of data, most of which is never properly filtered. According to Hingley, "Each airliner crossing the Atlantic generates 5 TB of data; and how many of them cross the Atlantic each day?" To properly analyze this amount of data, we need better techniques for extracting the valuable data. "You cannot do it with people. You must use artificial intelligence (AI) and machine learning (ML)."

Hingley emphasized the importance of mastering your data – not only for simplicity, but also for better governance. For example, the European Union (EU) General Data Protection Regulation (GDPR) is changing how organizations must handle personal data, with profound consequences for all businesses.

"The problem is that you need a single version of the truth," said Hingley. "Many IT organizations do not have that. If they were subpoenaed to produce every email containing the words 'Monte Carlo,' they could not do it. There are probably 25 copies of every e-mail, and no way to organize them. Data governance is extremely important. It's not a nice-to-have; it's essential. These regulations are coming – and not just in the EU; the GDPR model is being adopted in many countries."

Composable, software-defined cloud

In addition to artificial intelligence, organizations will need a common approach to cloud, multi-cloud, and hybrid cloud deployments, simplifying the management of diverse resources. Gardner cited recent news from Hewlett Packard Enterprise (HPE) as an example.

Announced in November 2018, HPE Composable Cloud is the first integrated software stack designed for composable environments. Optimized for applications running in virtual machines, containers, clouds, or on bare metal, this hybrid cloud platform offers customers the speed, efficiency, scale, and cost-effectiveness of public cloud providers. These benefits are made possible by intelligence-driven operations with HPE InfoSight, intelligent storage capabilities, an innovative fabric built for composable environments, and HPE OneSphere, the hybrid cloud management solution delivered as a service.

"I like what HPE is doing, especially the mix of different resources," Hingley said. "You also have the HPE GreenLake model underneath, so you pay only for what you use. You must be able to combine all these elements, as HPE does. In addition, in terms of architecture – the fabric-based networking approach, the software-defined approach, the API connections – these are essential for moving forward."

Automation and optimization across all of IT

New levels of maturity and composability are helping organizations improve the management of complex IT environments that are constantly changing and growing. Gaining a global view of IT could finally lead to automation and optimization across multi-cloud, hybrid, and legacy IT assets. Once this challenge is met, businesses will be better prepared to take on the next one.

To hear the full interview, click here. To learn more about HPE Composable Cloud, read the press release: Hewlett Packard Enterprise Expands Composable Strategy with New Capabilities to Accelerate Customer Innovation. Learn more about HPE OneSphere, the hybrid cloud management solution delivered as a service. To learn more about simplifying your digital transformation, click here.


About Chris Purcell

Chris Purcell leads analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. The Software-Defined and Cloud Group is responsible for marketing HPE OneView, HPE SimpliVity hyperconverged solutions, and Project New Hybrid IT Stack. To learn more about Chris Purcell, visit the HPE Shifting to Software-Defined blog.