Vendors and analysts all agree that IT faces the need to change. Many industry leaders say a new style of IT is being formed, and this new way of delivering IT is critical in meeting new and emerging business demands.
Users today want IT to be delivered and consumed in the same way that people use their tablets and mobile phones. The challenge for IT organizations is how to deliver a similar experience given the complexity of most modern organizations--and the legacy systems that often hinder this essential change.
One area where change is imminent and reaching critical mass is storage. Big data, the need for high-performance analytics, and ongoing massive storage growth all mean that today's infrastructure must evolve to deliver platforms that enable massive scale as well as on-demand processing performance.
Speakers at the recent HP Storage Summit held in Hong Kong revealed the specific challenges plaguing many IT leaders and some possible ways to solve this growing complexity.
According to Paul Haverfield, principal technologist, HP Storage Asia Pacific & Japan, the compound annual growth rate of data in Hong Kong (based on current 'business-as-usual' scenarios) is around 46-50%.
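As a back-of-the-envelope illustration (not a figure from the summit itself), a 46-50% compound annual growth rate implies that total data holdings double in well under two years. A minimal sketch of that arithmetic:

```python
import math

def doubling_time_years(cagr: float) -> float:
    """Years for a quantity to double at the given compound annual growth rate."""
    return math.log(2) / math.log(1 + cagr)

# The 0.46-0.50 range reflects the Hong Kong CAGR cited above.
for cagr in (0.46, 0.50):
    print(f"At {cagr:.0%} CAGR, data doubles roughly every "
          f"{doubling_time_years(cagr):.1f} years")
```

At these rates a storage estate roughly quadruples in under four years, which is why "just add more disks" stops being a sustainable answer.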
"And that's pedestrian compared to the scary prospect of what we can expect in the next few years of growth with big data," said Kaz Kempers, head of IT, ABN AMRO Clearing Hong Kong, who joined Haverfield and fellow banking IT executive Mike McCarthy in a panel discussion during the summit.
"Add metadata and various other unstructured data sources, and this will cause a massive wave of new data," said Kempers. "I think that trying to cope with this sheer data growth efficiently and in a manageable way is one of the biggest challenges we face today."
Mike McCarthy, regional head, Group Infrastructure Services, Technology Services, RBS, noted that in addition to the data storage growth, the issue of knowing what data each company has and where each bit of data resides is still a huge challenge.
McCarthy said that while big data garners the headlines, many large enterprises like his own still seek ways to cope with current data growth rates, with growing demands from developers and business for greater speed and scale.
Add the swathe of regulatory compliance needs, which continue to swell the already burgeoning archive demands. "In our business," said McCarthy, "many countries across the region still insist that companies retain customer data within their shores and also for long periods of time--which adds lifecycle management to the heap of data challenges."
Kempers said the fundamental challenge is undergoing continual cycles of upgrades: expanding storage capacity, optimizing the use of existing systems, and seeking new technologies to make everything simpler and cheaper to manage.
The big data issue continues to expand as data is no longer just numbers and spreadsheets, but also video and audio, said McCarthy. "We live in a world where we are now required to record people's mobile voice communications and their Blackberry messages if they are staff working on the trading floor," he said.
The solution in the past was to just add more disk space, but increasingly that no longer suffices. "Unfortunately slamming in more disks is still something we all have to do," admitted Kempers. "But adding in virtualization ensures that we use all of the disk that we add and helps us leverage our infrastructure in an optimum manner."
Cloud to the rescue?
The game of predicting future storage requirements and doing your best to avoid over-booking new storage systems remains, but cloud and other technology considerations are being explored simultaneously.
"It's still very easy to fall into the 'slam in more disks' syndrome, which is what I myself do at home," said HP's Haverfield. He added that many businesses in Asia are lucky in that the majority are slightly behind the US and Europe in the technology adoption curve, so there is often spare capacity to add more disk space.
"But that capacity won't last long," said Haverfield. "In just five years' time, it's predicted Asia Pacific will overtake North America in net data holdings. That means that at some point everyone will hit a brick wall with data growth."
While companies will continue to add disks to alleviate storage problems, Haverfield stressed that businesses must explore better data lifecycle management and archiving practices to maximize new resources.
He advised IT leaders to start considering these challenges now, and begin planning for critical decisions on what data architecture to follow and what key platforms to adopt when deciding the best approach to meeting new business needs.
The approach at RBS is to try to reduce the storage estate where possible to make things more manageable. "I'm sold on the idea of cloud but we need more cooperation from the industry, regulators and providers," said McCarthy. "Data has become so important, and at the same time so difficult to manage, that we need specialist providers to help. But if you go all the way to the external cloud, regulators and banks are nervous about that scenario."
As a result, while many businesses have begun leveraging cloud-based storage, cloud DR, backup, and archiving solutions, in banking and finance, most institutions can only pursue building their own private cloud platforms to ease the data management burden.
Virtual storage 2.0
Virtualization of storage has long been touted as the solution to rising management complexity, and it has come to the fore again as vendor-supplied technology has matured, allowing vendors to deliver more complete offerings.
Virtualization plays a key role in creating better usage and manageability, said Kempers. "This is exactly what cloud is doing," he said. "By virtualizing and adding management tools internally, we are trying to deliver the ease-of-use that you can get from the cloud today." He added that current key focus areas for him include better ways of classifying data, and seeking ways to deal with data sovereignty issues.
Kempers stressed that virtualized systems are not currently applicable to everything. "It's important to assess what you are using your storage and data systems for, because you cannot virtualize everything," he said. He pointed to high-speed trading platforms as an example where despite improved latency of virtual systems, it is still preferable to use dedicated hardware for processing and storage to achieve the low latency required.
Storage virtualization is nonetheless making significant impacts in many areas of enterprise infrastructure, due to support for concepts like software-defined storage by big vendors such as VMware and Microsoft. Haverfield predicts further adoption of virtualized storage in the near future.
The HP storage technologist said that early efforts to virtualize storage simply tried to pool a variety of multi-vendor storage systems, but this patchwork effort often created new complexity and management costs. Software-defined storage (SDS) is helping create new levels of automation and ease of management.
The notion of software-defined storage promises to deliver on the concept of breaking the proprietary links to hardware and separating the software layer from previously dedicated hardware systems.
According to Haverfield, SDS is a viable technology today for small and large businesses alike. Service providers have typically been early adopters, with one Japanese provider using SDS to cope with demand spikes during seasonal events such as cherry blossom season.
By using SDS technology from HP, the service provider in Japan has been able to rapidly provision new infrastructure to cope with huge peaks in processing requirements through their IT-as-a-service infrastructure, said Haverfield. Once demand returns to normal, the infrastructure automatically scales down to normal operations, he said.