STRATEGIC HPC DEVELOPMENT IN 2016
Significant milestones were achieved and new developments were initiated in areas such as HPC resource provisioning, research network development, and Big Data analytics infrastructure.
HPC Resource Provisioning
More than one petaflop of computing power was made available to NUS researchers with the launch of the National Supercomputing Centre (NSCC) in April 2016. NSCC was jointly developed by NUS, A*STAR, NTU and SUTD. Since the launch, more than 500 NUS researchers have registered to use NSCC resources. Besides the HPC cluster with more than 30,000 CPU cores, researchers can also access a GPGPU cluster, a visualisation system and petabytes of storage capacity for their research work. Computer Centre provides local support for the use of NSCC resources.
On campus, Computer Centre has initiated a hardware refresh of some of its central resources. In 2Q 2017, users will have access to new cluster nodes with 24 cores and 192GB of memory per node.
To further enhance our research computing support, we plan in the coming years to harness cloud resources to cater to ad-hoc and specialised HPC requirements, both hardware and software.
Research Network Development
An agreement has been established between NUS and NSCC to extend the NSCC-NUS network (currently with 40Gbps capacity) to various locations across campus. In addition to extending its reach, the network's capacity will be upgraded to support up to 100Gbps of bandwidth.
With this dedicated research network, a more appropriate set of network and data transfer policies and technologies can be implemented, enabling research entities to make full use of the bandwidth provided. The development is expected to benefit data-intensive research collaboration in areas such as biomedicine, healthcare, Smart Nation initiatives, and computational science and engineering.
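To give a sense of what the bandwidth upgrade means for data-intensive collaboration, the sketch below estimates transfer times for a large dataset at the current and planned link speeds. The 80% efficiency factor is an illustrative assumption (protocol overhead and concurrent traffic vary in practice), not a measured figure.

```python
def transfer_time_hours(data_tb: float, link_gbps: float,
                        efficiency: float = 0.8) -> float:
    """Estimate wall-clock hours to move a dataset over a dedicated link.

    data_tb    -- dataset size in terabytes (decimal, 1 TB = 8e12 bits)
    link_gbps  -- nominal link speed in gigabits per second
    efficiency -- assumed fraction of nominal bandwidth achieved
                  in practice (an illustrative figure, not measured)
    """
    bits = data_tb * 8e12
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# Moving a hypothetical 100 TB research dataset across campus:
print(f"40 Gbps:  {transfer_time_hours(100, 40):.1f} h")   # ~6.9 h
print(f"100 Gbps: {transfer_time_hours(100, 100):.1f} h")  # ~2.8 h
```

At these assumed rates, the upgrade cuts a day-scale transfer to a few hours, which is the practical difference between staging data overnight and working with it interactively.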
Computer Centre is working closely with NSCC, University Campus Infrastructure and the Office of the Deputy President (Research & Technology) on this project. This development is expected to take two years to complete.
Big Data Analytics Infrastructure Development
In addition to the research network, other critical components for the support of Big Data analytics include the development of a computing platform and storage capacity.
Over the years we have been providing the Utility Storage Service to support research data storage using commodity-based storage hardware. With increasing demand for large-scale research data storage, we have started migrating the service from commodity to enterprise hardware while keeping the cost affordable. The enterprise storage system offers greater reliability and scalability, enabling fast provisioning and seamless scaling up to hundreds of terabytes. This resource is suitable for long-term storage and sharing of research data.
To provide a generic Big Data computing platform for research, a Hadoop computing testbed has been set up at the Computer Centre. The testbed is currently made up of a compute cluster with local storage. We are working on enhancing this platform by introducing a high-performance enterprise storage system, offering greater reliability and scalability, to work alongside the compute cluster. With this enhancement, the Hadoop system can be used not only for analytics but also as a repository for storing and sharing research data. The enhancement is expected to be completed by April 2017.
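Jobs on a Hadoop platform such as this testbed are commonly expressed as a map phase and a reduce phase. The sketch below shows the classic word-count pattern in Python, with a small in-process simulation of the map, shuffle and reduce steps; it is an illustration of the programming model only, not code from the testbed itself.

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word seen."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum counts per word. Assumes pairs arrive
    sorted by key, as Hadoop's shuffle phase guarantees."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

def run_locally(lines):
    """Simulate the map -> sort/shuffle -> reduce pipeline in-process."""
    return dict(reducer(sorted(mapper(lines))))

# Example: counting words across two lines of text
print(run_locally(["big data big compute", "data at scale"]))
# {'at': 1, 'big': 2, 'compute': 1, 'data': 2, 'scale': 1}
```

On the actual cluster the same mapper and reducer logic would be distributed across nodes (for example via Hadoop Streaming), with the framework handling the sort/shuffle step that `run_locally` performs with `sorted`.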