In the past two decades, advanced cyberinfrastructure has become a critical element of science and engineering research — a result of the increasing scope and accuracy of simulations of natural and engineered systems as well as the growing volume of data generated by instruments, simulations, experiments and observations.
Our ability to view these systems through a computational lens (that is, using computational abstractions of the domains) and to acquire, share, integrate, and analyse disparate types of data has greatly benefited progress in many fields. These advances, however, would not have been possible without advanced data and computational cyberinfrastructure and tools for data capture, integration, analysis, modelling, and simulation.
What is Cyberinfrastructure (CI)?
Understanding complex systems, such as biometric systems, the human brain, or the environment, has been made easier by the availability of technology. One such technology, Cyberinfrastructure (known as e-research, e-science, or e-infrastructure in Europe, Australia, and Asia), brings together high-performance computing, remote sensors, large data sets, middleware, and sophisticated applications for modelling, simulation, and visualisation.
CI consists of computing systems, data storage systems, advanced instruments and data repositories, visualisation environments, and people, all linked by high-speed networks. Together, these make possible innovations and discoveries that would otherwise be very difficult to achieve.
The term CI was first used by the U.S. National Science Foundation (NSF) and refers to information technology systems that provide particularly powerful and advanced capabilities.
The ultimate goal of those creating and implementing Cyberinfrastructure projects is to make advanced information technology systems easy to use. Processors, storage devices, sensors, and other physical assets are all part of a CI, but it is much more than advanced networks connecting sophisticated applications running on powerful computer systems. It also involves people, who gain access to expertise, tools, and facilities and who in turn help to generate knowledge.
The U.S. Department of Energy, NASA, and many other organisations also have Cyberinfrastructure plans and development projects. CI is not limited to the sciences; it can also serve the arts, humanities, and social sciences.
How does CI work?
CI depends on a technical infrastructure that knits together high-speed networks with high-performance, high-availability, and high-reliability computational resources.
Management systems control the usage, performance, and availability of computing systems, while security systems protect these assets. Data can be large aggregations of previously collected data or live feeds from remote sensors located anywhere in the world. Many of the systems are housed in different locations, and experiments typically run on “virtual machines” in which spare cycles from dozens or hundreds of computers are used for a single task. Datasets can be distributed as well, with neurological data coming from one university and physiological data from another.
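The "spare cycles" idea described above can be sketched in miniature: one large task is split into chunks, each chunk is handed to a separate worker, and the partial results are combined. The sketch below is purely illustrative; it uses a local process pool to stand in for the dozens or hundreds of distributed machines a real CI would coordinate, and the function and variable names are hypothetical.

```python
# Illustrative sketch only: a local process pool stands in for the many
# distributed machines whose spare cycles a CI system would harness.
from multiprocessing import Pool


def analyse_chunk(chunk):
    """Toy 'analysis' step: sum the squares of one slice of the data."""
    return sum(x * x for x in chunk)


def run_distributed(data, n_workers=4):
    """Split the dataset into roughly equal chunks, farm each chunk out
    to a worker, then combine the partial results into one answer."""
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(n_workers) as pool:
        partial_results = pool.map(analyse_chunk, chunks)
    return sum(partial_results)


if __name__ == "__main__":
    dataset = list(range(1000))
    print(run_distributed(dataset))
```

In a real CI deployment the workers would be remote machines reached over high-speed networks, and middleware (scheduling, authentication, data movement) would replace the simple process pool shown here.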
Why is it significant?
Data that are collected, archived, and analysed are on a scale that was previously unimaginable; CI tackles this mountain of information, allowing researchers to answer questions that could hardly be asked a decade ago. It also allows those previously unable to join in leading-edge scientific research to participate, to learn by doing, not just by listening.
For instance, CI has the potential to transform the study of language by making data about all the world’s languages accessible to researchers. Such a comprehensive linguistic resource could transform the linguistic sciences.
Where is it going?
CI is in a relatively early stage of development. Research is underway to improve its capacity, reliability, and management. While its origins are in research and science, it is increasingly being applied in the humanities and arts, as well as in education. Furthermore, it enables increased international collaboration through virtual organisations, data sharing, and access to tools. With CI, these tools become community resources on a national and global scale.
What are the downsides?
Because of the scale of many current and proposed CI projects, some believe that the opportunities are limited to large, well-funded research institutions. Individuals at smaller institutions assume they lack the infrastructure to participate, even though many can contribute to and benefit from such projects. A focus on the physical assets necessary for CI fails to adequately characterise the importance of people. Moreover, as with any large-scale, distributed-technology project, security concerns arise for data, instruments, and applications and must be adequately addressed to promote participation.