Technology moves at an unbelievable pace, and we do our best to keep up. Constantly updating code, scripts and pipelines helps keep things running smoothly and efficiently. The biggest issue is matching hardware with software – a year-old server is still a powerhouse, but it may not run the latest versions of certain applications. Working through an application's version history to see what matches the hardware we have, and how it can enhance the quality of the work it is used for, is almost a full-time job in itself.
Evaluating new technologies as a natural progression from what we already have is another aspect of the above. For example, the team's BioSLAX initiative, a popular live bioinformatics medium, was archived after 8 years of local and worldwide use: external media had become cheaper and far more capacious, negating the need for the novel compression techniques BioSLAX employed. Many of the projects built on BioSLAX were subsequently shelved as well, such as the BioDB 100 project, which instantly reinstantiated BioSLAX images on a private cloud to start services and databases on demand. Now, with the emergence of Docker, the project has a new lease of life and an even broader use case, since it is no longer restricted to BioSLAX live media but can run on any cloud-installed operating system with Docker installed.
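As a rough illustration of that Docker route, a tool and all its dependencies can be packaged into a single portable image. The sketch below is purely hypothetical – the base image, package and names are placeholders, not actual BioSLAX or BioDB artifacts:

```dockerfile
# Hypothetical sketch: bundling a bioinformatics tool with all its
# dependencies into one image, in the spirit of the on-demand
# service model described above. Names are illustrative only.
FROM ubuntu:22.04

# Install the tool and its dependencies in a single self-contained layer
RUN apt-get update && \
    apt-get install -y --no-install-recommends ncbi-blast+ && \
    rm -rf /var/lib/apt/lists/*

# Expose the tool's command-line interface as the container entry point
ENTRYPOINT ["blastn"]
CMD ["-help"]
```

Once built, such an image could be started on demand on any Docker-capable host (e.g. `docker run --rm <image> -query input.fa -db mydb`), much as BioDB 100 reinstantiated whole BioSLAX images on the private cloud.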
Within NUS itself, BIC was the first unit – ahead even of the computer center – to deploy a private cloud infrastructure and integrate it into its operations. By completely modularizing applications and databases with all their necessary dependencies, we were also the first to conceptualize complete application sharing, some 10 years before Docker first appeared.
It pays to be five steps ahead of the curve.