GROMACS
GROningen MAchine for Chemical Simulations (GROMACS) is a molecular dynamics package mainly designed for simulations of proteins, lipids, and nucleic acids. It was originally developed in the Biophysical Chemistry department of the University of Groningen, and is now maintained by contributors in universities and research centers worldwide.[4][5][6] GROMACS is one of the fastest and most popular software packages available,[7][8] and can run on central processing units (CPUs) and graphics processing units (GPUs).[9] It is free, open-source software released under the GNU General Public License (GPL),[3] and, starting with version 4.6, the GNU Lesser General Public License (LGPL).
Developer(s) | University of Groningen, Royal Institute of Technology, Uppsala University[1] |
---|---|
Initial release | 1991 |
Stable release | 2020.3 / 9 July 2020[2] |
Written in | C++, C |
Operating system | Linux, macOS, Windows, any other Unix variety |
Platform | Many |
Available in | English |
Type | Molecular dynamics simulation |
License | LGPL (version 4.6 and later), GPL (earlier versions)[3] |
Website | www |
History
The GROMACS project began in 1991 at the Department of Biophysical Chemistry, University of Groningen, Netherlands, where it was developed until 2000. The goal was to construct a dedicated parallel computer system for molecular simulations, based on a ring architecture. The molecular dynamics specific routines were rewritten in the programming language C from the Fortran 77-based program GROMOS, which had been developed in the same group.
Since 2001, GROMACS has been developed by the GROMACS development teams at the Royal Institute of Technology and Uppsala University, Sweden.
Features
GROMACS is operated via a command-line interface and uses files for input and output. It reports calculation progress and an estimated completion time during runs, and ships with a trajectory viewer and an extensive library for trajectory analysis.[3] In addition, support for many different force fields makes GROMACS very flexible. It can be executed in parallel, using the Message Passing Interface (MPI) or threads. A supplied tool converts molecular coordinates from Protein Data Bank (PDB) files into the formats GROMACS uses internally. Once a configuration for a system of several molecules (possibly including solvent) has been prepared, the simulation run, which can be time-consuming, produces a trajectory file describing the movements of the atoms over time. That file can then be analyzed or visualized with several supplied tools.[10] Since version 5, GROMACS can offload computation to GPUs, using CUDA on Nvidia hardware and OpenCL on AMD hardware, with substantial acceleration over CPU-only runs.
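The preparation-and-run workflow described above can be sketched with the unified `gmx` command introduced in version 5; the file names, water model, and box dimensions here are illustrative choices, not fixed by GROMACS:

```shell
# Convert a PDB structure into GROMACS coordinate (.gro) and topology (.top) files
gmx pdb2gmx -f protein.pdb -o protein.gro -p topol.top -water spce

# Define a simulation box around the molecule and fill it with solvent
gmx editconf -f protein.gro -o boxed.gro -c -d 1.0 -bt cubic
gmx solvate -cp boxed.gro -cs spc216.gro -p topol.top -o solvated.gro

# Combine coordinates, topology, and run parameters (md.mdp) into a portable
# run input file, then perform the (potentially long) simulation itself
gmx grompp -f md.mdp -c solvated.gro -p topol.top -o md.tpr
gmx mdrun -deffnm md   # writes the trajectory for later analysis
```

When GROMACS is built with CUDA or OpenCL support, `gmx mdrun` detects and uses available GPUs automatically; the resulting trajectory can then be examined with the supplied analysis tools.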
Easter eggs
As of January 2010, the GROMACS source code contains approximately 400 alternative expansions of the GROMACS acronym, written as jokes among the developers and biochemistry researchers. These include "Gromacs Runs On Most of All Computer Systems", "Gromacs Runs One Microsecond At Cannonball Speeds", "Good ROcking Metal Altar for Chronical Sinner", "Working on GRowing Old MAkes el Chrono Sweat", and "Great Red Owns Many ACres of Sand". One is randomly selected to appear in GROMACS's output stream. In one instance, such an acronym caused offense.[11]
Applications
Under a non-GPL license, GROMACS is widely used in the Folding@home distributed computing project for simulations of protein folding, where it forms the base code for the project's largest and most regularly used series of calculation cores.[12][13] EvoGrid, a distributed computing project to evolve artificial life, also employs GROMACS.[14]
See also
- OPLS
- GROMOS
- CHARMM
- NAMD
- Yasara
- Comparison of force field implementations
- Abalone (molecular mechanics)
- Grace (plotting tool)
- Folding@home
- Rosetta@home
- Comparison of software for molecular mechanics modeling
- Molecular design software
- OpenMM
- Bennett acceptance ratio
- VisIt
- VOTCA
References
- The GROMACS development team
- "Gromacs Downloads". gromacs.org. Retrieved 2020-08-14.
- "About Gromacs". gromacs.org. 16 August 2010. Retrieved 2012-06-26.
- "People — Gromacs". gromacs.org. 14 March 2012. Retrieved 26 June 2012.
- Van Der Spoel D, Lindahl E, Hess B, Groenhof G, Mark AE, Berendsen HJ (2005). "GROMACS: fast, flexible, and free". J Comput Chem. 26 (16): 1701–18. doi:10.1002/jcc.20291. PMID 16211538.
- Hess B, Kutzner C, Van Der Spoel D, Lindahl E (2008). "GROMACS 4: Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation". J Chem Theory Comput. 4 (3): 435–447. doi:10.1021/ct700301q. hdl:11858/00-001M-0000-0012-DDBF-0. PMID 26620784.
- Carsten Kutzner; David Van Der Spoel; Martin Fechner; Erik Lindahl; Udo W. Schmitt; Bert L. De Groot; Helmut Grubmüller (2007). "Speeding up parallel GROMACS on high-latency networks". Journal of Computational Chemistry. 28 (12): 2075–2084. doi:10.1002/jcc.20703. hdl:11858/00-001M-0000-0012-E29A-0. PMID 17405124.
- Berk Hess; Carsten Kutzner; David van der Spoel; Erik Lindahl (2008). "GROMACS 4: Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation". Journal of Chemical Theory and Computation. 4 (3): 435–447. doi:10.1021/ct700301q. hdl:11858/00-001M-0000-0012-DDBF-0. PMID 26620784.
- "GPUs — Gromacs". gromacs.org. 20 January 2012. Retrieved 26 June 2012.
- "GROMACS flow chart". gromacs.org. 18 January 2009. Archived from the original on 24 June 2010. Retrieved 26 June 2012.
- "Re: Working on Giving Russians Opium May Alter Current Situation". Folding@home. 17 January 2010. Retrieved 2012-06-26.
- Pande lab (11 June 2012). "Folding@home Open Source FAQ". Folding@home. Stanford University. Archived from the original (FAQ) on 17 July 2012. Retrieved 26 June 2012.
- Adam Beberg; Daniel Ensign; Guha Jayachandran; Siraj Khaliq; Vijay Pande (2009). Folding@home: Lessons From Eight Years of Volunteer Distributed Computing (PDF). Parallel & Distributed Processing, IEEE International Symposium. pp. 1–8. doi:10.1109/IPDPS.2009.5160922. ISBN 978-1-4244-3751-1. ISSN 1530-2075.
- Markoff, John (29 September 2009). "Wanted: Home Computers to Join in Research on Artificial Life". The New York Times. Retrieved 26 June 2012.
External links
- Official website
- GROMACS on GPUs
- Binaries of GROMACS 4.6.5 for Windows / Cygwin
- GROMACS on the bwHPC Clusters in Germany