In today’s cloud-centric world, organizations already know they need a cloud solution. The question they are asking is how best to deploy. While there are a multitude of factors that should be considered, proximity is something that organizations tend to overlook due to the cloud’s ubiquitous nature.
The importance of proximity first came to light several years ago, when organizations began instituting public cloud-first initiatives. Organizations categorized their applications by whether they could, or could not (at least not easily), be moved to the cloud. Apps were moved, and developers began to create and innovate with the extensive toolsets available from the cloud providers.
Despite some initial success, many of the initiatives stalled. Cost, performance, bandwidth, security and access became barriers to execution. Application refactoring, latency issues and regulatory concerns slowed or stopped forward momentum. Furthermore, organizations discovered that many new applications deployed in the public cloud needed access to legacy data sources and applications that could not be moved into the cloud. The distance between the cloud and the legacy apps became a challenge, and moving legacy applications and data sources as physically close to the cloud as possible became a high priority.
To meet this need, data center providers began locating their facilities in close physical proximity to the public clouds. This proximity established cost-efficient connectivity at speeds resembling local data center Ethernet, enabling cloud-native applications, and applications previously migrated to the public cloud, to access legacy data sources at the speeds necessary to restore performance and the end-user experience.
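Why does physical distance matter so much? Light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, so round-trip propagation delay grows directly with distance. The quick sketch below illustrates the arithmetic; the distances are illustrative assumptions, not measurements of any particular facility.

```python
# Rough sketch: best-case round-trip fiber propagation delay vs. distance.
# The distances below are illustrative assumptions, not measured values.

SPEED_OF_LIGHT_KM_S = 299_792  # speed of light in a vacuum, km/s
FIBER_FACTOR = 0.67            # light in fiber travels at ~2/3 of that speed

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Compare a data center adjacent to a cloud region with one across a continent:
for km in (5, 50, 500, 4000):
    print(f"{km:>5} km -> {round_trip_ms(km):6.2f} ms round trip")
```

A facility a few kilometers from the cloud keeps propagation delay in the tens of microseconds, comparable to a local Ethernet hop, while a cross-continent path adds tens of milliseconds before queuing or processing delays are even counted.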
PROXIMITY AND PATH OPTIMIZATION
Proximity also plays a part in the performance of applications. Do you know how many SaaS applications your organization uses today, and how much of your end-user bandwidth is being directed to sites hosted outside your corporate data center?
For years, corporate data centers have been the hub of the enterprise network and have hosted applications that serve the workforce and end users. Traffic from remote or branch locations is directed back to the corporate data center over secure private connections, and from there, internet-bound traffic is routed through a protected internet link.
As network traffic is increasingly directed toward cloud-based applications, traditional data center designs may fall short on performance.
Organizations should consider a design in which the hub of the network sits centrally among the major SaaS and cloud providers that end users are accessing. Such a hub delivers traffic directly to those providers over dedicated connections rather than a best-effort internet pipe.
With redundant, dedicated bandwidth to major cloud and SaaS providers available directly from server racks via cross-connects, organizations can deliver traffic over the optimal path with predictable performance. An economic impact assessment can help determine the value of setting up hubs like this in strategic locations around the globe.
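Choosing where to place such a hub can be framed as a simple optimization: weight each provider by the traffic end users send it, then pick the candidate site with the lowest traffic-weighted distance. The sketch below illustrates the idea; the site names, grid coordinates and traffic weights are all hypothetical.

```python
# Sketch: pick the hub site with the lowest traffic-weighted distance to
# the providers users access most. All names, positions and weights are
# hypothetical illustrations, not real locations or measurements.
import math

providers = {            # provider -> ((x, y) grid position, traffic weight)
    "saas-a":  ((0, 0),  0.5),
    "cloud-b": ((10, 0), 0.3),
    "cloud-c": ((5, 8),  0.2),
}

candidate_hubs = {       # candidate site -> (x, y) grid position
    "site-east":    (1, 1),
    "site-central": (5, 3),
    "site-west":    (9, 1),
}

def weighted_distance(hub: str) -> float:
    """Sum of distances to each provider, weighted by its traffic share."""
    pos = candidate_hubs[hub]
    return sum(w * math.dist(pos, p_pos) for p_pos, w in providers.values())

best = min(candidate_hubs, key=weighted_distance)
print(best, round(weighted_distance(best), 2))
```

Note how the weights dominate: a site close to the heaviest-traffic provider can beat a geometrically central one, which is why traffic analysis should precede site selection.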
Network designs that shift the location of the hub also require a fresh look at branch connectivity options. Technologies such as SD-WAN and carrier-based virtual private network services can deliver branch connectivity out of a data center facility at a cost savings compared to backhauling those connections to corporate data centers.
Cross-connects also offer a major security benefit: internet firewall rules become less complex to manage. Instead of trying to collapse everything into one internet edge firewall, organizations can deploy connection-specific rules on each cross-connect.
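One way to picture the simplification is to compare a single collapsed edge rule list with per-link rule sets. The minimal sketch below models rules as plain data; the provider names, ports and rules are hypothetical and purely illustrative.

```python
# Sketch: one collapsed edge rule list vs. per-cross-connect rule sets.
# Provider names, ports and rules are hypothetical illustrations.

# A single internet edge firewall mixes rules for every destination into
# one ordered list that must be audited as a whole.
collapsed_edge_rules = [
    {"dest": "saas-crm",   "port": 443, "action": "allow"},
    {"dest": "cloud-iaas", "port": 443, "action": "allow"},
    {"dest": "cloud-iaas", "port": 22,  "action": "deny"},
    {"dest": "*",          "port": "*", "action": "deny"},  # default deny
]

# With dedicated cross-connects, each link carries only the rules for its
# one provider, so each rule set stays small and easy to audit.
per_cross_connect_rules = {
    "saas-crm":   [{"port": 443, "action": "allow"},
                   {"port": "*", "action": "deny"}],
    "cloud-iaas": [{"port": 443, "action": "allow"},
                   {"port": 22,  "action": "deny"},
                   {"port": "*", "action": "deny"}],
}

# Every per-link rule set is no larger than the collapsed edge list.
for link, rules in per_cross_connect_rules.items():
    assert len(rules) <= len(collapsed_edge_rules)
    print(f"{link}: {len(rules)} rules")
```

The benefit compounds as providers are added: the collapsed list grows with every destination, while each per-link set grows only with its own provider's needs.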
PROXIMITY AND CLOUD SECURITY
Some analysts say that public cloud providers now offer safe environments for sensitive data and enable innovative security services. Despite this, organizations still feel uneasy about security management in the public cloud.
Similar to the proximity-based approach for legacy applications, organizations should consider storing data in a location where it is accessible to the elastic compute resources offered by cloud providers. Vital data still resides on the organization's network, where control of both the data and the systems hosting it is retained, while the data remains surrounded by the security infrastructure that current operational models are designed to support.
According to IDC, more than 60% of enterprises in the APAC region will have a multicloud strategy by the end of 2018. As we enter the year of multicloud, organizations must keep proximity in mind in order to get the best out of a multicloud model. I believe that executing a successful multicloud strategy comes down to location. With proximity being the most distinctive value a data center provider can offer, organizations should prioritize it over other factors.
Nilesh Mistry, Vice President, Head of APAC, World Wide Technology (WWT)