By Will Wade, Director GRID Product Management, NVIDIA
In its first incarnation, Virtual Desktop Infrastructure (VDI) showed very real limitations in performance and data management for many applications, such as visualization. The challenge had two fundamental aspects: the first was the inability to deliver enough performance for workstations in a virtual environment; the second was the isolation of data on individual workstations, where it could not be effectively shared.
Visualization applications are not the only challenge for traditional virtual desktop solutions. Similar issues exist today in standard workflows for product development, architectural projects, design activities, pharmaceutical research, and other high-intensity applications. These limitations have had negative consequences for project timelines, costs, and the quality of results.
Those limitations no longer apply. NetApp and NVIDIA now offer a new approach to solving the twin problems of virtualization for demanding applications, including graphics-intensive workloads. This solution puts the data, compute, graphics, and virtualization in close proximity to one another in the data center. This approach changes the fundamental design point of the infrastructure by moving data, compute, and visualization power away from individual remote workstations, making it a shared infrastructure with minimal latency to multiple lightweight client seats.
Combining NVIDIA GRID with NetApp’s FlexPod Architecture and FAS storage is an excellent way to solve the performance and data consistency/availability challenge. GRID powers the application, delivering local workstation performance in a virtual environment, while the storage functionality of the NetApp technology enables multiple users to manipulate, update, and share the same files and databases at the same time.
Centralizing the data so that it can be used effectively in real-time collaboration begins to deliver the benefits inherent in desktop virtualization to demanding applications such as visualization. Added benefits of centralizing data on a NetApp infrastructure include the following:
- Improved data management and protection. A central repository allows IT to ensure that data is backed up and easily recovered in case of failure. IT can also manage compliance issues that arise in healthcare, financial services, or other applications that handle personal or proprietary information.
- Support for larger and more complex data sets. Many applications that fit this profile have seen continued growth in the size and complexity of the data sets end users require. When these huge data sets are distributed, problems range from an increased need for costly bandwidth to lag when sending information. Centralizing the data and processing power mitigates those cost and performance issues.
- Reduced project time to completion. Without centralized, shared data, some collaborators must wait for peers to complete their work and send updated files before they can begin their own. These data bottlenecks extend project timelines and compromise the productivity of the team. With a centralized repository, all contributors have access to the same data sets and files, so no individual holding sole access to a file can become a bottleneck.
- Support for the modern workforce. Many industries, including engineering, manufacturing, and pharmaceutical research, have demanding workflows and use collaborative graphics applications. Teams are often comprised of employees, partners, and individual contractors. The traditional approach to infrastructure requires either putting sensitive data outside of corporate control, providing approved devices to partners and contractors, or deploying additional security on systems the organization doesn’t own. Centralizing compute and graphics makes it much simpler to support “free-range” employees and enables them to work remotely.
- Improved data security. Keeping data centralized in the data center, rather than physically residing on many different endpoints, is obviously more secure, if only from a physical perspective. Centralizing data on a NetApp appliance also makes it far simpler to implement robust security for sensitive data, including much more effective access control.
- Simplified movement to the cloud. Using NetApp storage solutions to centralize data in the data center makes any transition to a cloud infrastructure more efficient. NetApp offers full integration with key cloud data management solutions such as VMware, Citrix, and OpenStack.
Moving beyond the on-premise data center, IBM Cloud has just introduced a new remote visualization and collaboration platform that takes this model to a new level. A live demonstration of the technology can be seen at NetApp Booth #702 at IBM InterConnect 2016. With this new platform, clients can truly take advantage of cloud infrastructure as a service (IaaS) across continents to accelerate their design and research work, with increased security for their content and data IP. The IBM Cloud platform provides NVIDIA GRID, NetApp storage, and bare-metal servers running leading virtualization applications, along with client-owned application software licenses, in the cloud, so IT teams can now run the most demanding workloads virtually. This solution helps IT organizations and lines of business gain additional savings from virtualization, protect sensitive data, and improve the collaboration process to drive innovation among geographically distributed teams, all with an OPEX service pricing model that meets their business needs and addresses seasonal fluctuations in demand.