Understanding star formation is one of the major challenges of modern astrophysics. It has been identified as one of the four key questions by ASTRONET, as recently published in "A Science Vision for European Astronomy". Indeed, very few problems in astrophysics have no link to stars. As such, the field has received considerable attention over the years, both observationally and theoretically.
With the improvement of observational facilities, in terms of sensitivity as well as spatial resolution, in the sub-millimeter and in the infrared, much progress has been achieved, leading to a now well-accepted theory of star formation. Stars form inside molecular clouds through the gravitational collapse of prestellar dense cores. The dynamical evolution of these clouds is governed by the intricate interplay between self-gravity on the one hand and supersonic turbulence, magnetic fields, and thermal pressure on the other. After the young protostar forms, it accretes the infalling surrounding envelope. It is also commonly accepted that during this accretion phase, circumstellar disks form, outflows and jets are launched, and fragmentation into a few objects may occur. More specifically, large surveys of molecular clouds have been conducted which provide reliable statistical information about their spatial and velocity structure as well as the distribution of clumps and cores. In addition, observations with very high spatial resolution have provided new insight into the immediate environment of individual protostars and now allow us to study the scale on which fragmentation is thought to occur in unprecedented detail.
In parallel to the observational studies, important theoretical and numerical efforts have been carried out. In particular, large numerical simulations have been performed to model both the large scales of the interstellar medium and the formation of molecular clouds, the internal dynamics of the molecular clouds, and the collapse of individual objects. At the same time, there has been considerable progress in modeling the microphysical processes which are central to understanding the heating and cooling of the gas and to obtaining reliable diagnostics of its chemical and thermodynamical state as well as its dynamical evolution. It has proved crucial to adequately treat radiative transfer in the continuum as well as in atomic and molecular emission lines. The advent of novel algorithms and highly efficient software packages to solve the multi-dimensional radiative transfer equations has provided additional stimulus to this field.
Despite the enormous progress in recent years, there are still many open questions and our understanding of stellar birth is far from complete. Even explanations of very fundamental issues, such as the low star formation rate in our Galaxy or the physical origin of the distribution of stellar masses at birth (the IMF), are still under debate. The conditions under which the cores collapse and fragment are still very uncertain. In a related way, the scenario for the formation of circumstellar, centrifugally supported, and magnetized disks requires significant clarification. New generations of sub-mm and infrared telescopes, ground-based as well as space-borne, such as ALMA or Herschel, will provide a wealth of new data with unprecedented sensitivity and resolution. The physical interpretation of these high-precision observations requires new theoretical models with high predictive power. This can only be achieved by combining gas dynamics with chemistry and radiation, which is the basis of the simulation data presented here. The goal is to provide a set of molecular cloud models which can be directly compared to observational data. This includes a large number of statistical characteristics and synthetic line emission maps in various tracer molecules.
The organisation of the database closely reflects the theoretical data model, which has been developed in the context of the International Virtual Observatory in order to ensure interoperability with other astrophysics projects. It therefore follows a hierarchical structure.
The database is organised into projects, each of which assembles a set of related numerical simulations that investigate a specific problem, such as molecular cloud formation or prestellar core collapse.
Each simulation refers to a protocol, which can be defined as a particular configuration of a modelling code. Typically, it defines a class of numerical experiments determined by a small number of parameters (for example, the mass and the initial temperature and density of a prestellar core). For each protocol, a description is given, including the applied physics and the control parameters.
For each protocol, a series of numerical experiments has been performed. An experiment corresponds to a specific run of a given protocol, that is, running the code for a given set of parameters. A description of each experiment is given including, of course, the values of the chosen parameters.
The results of each experiment are then organised into snapshots, each of which records the state of the numerical experiment at a particular time. Usually a few snapshots are available. In the database, snapshots are characterised by a few relevant quantities, which can be, for example, statistical indicators as well as images.
Finally, for each snapshot, a set of post-processings can be provided. A post-processing is the result of a particular procedure applied to the snapshot data, for example a clump extraction or a radiative transfer calculation.
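The project → protocol → experiment → snapshot → post-processing hierarchy described above can be sketched as a set of nested records. This is only an illustrative model: the class names, field names, and example values below are assumptions for the sketch, not the actual StarFormat database schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative sketch of the hierarchy; names and values are assumptions.

@dataclass
class PostProcessing:
    procedure: str                         # e.g. "clump extraction"
    results: Dict[str, float] = field(default_factory=dict)

@dataclass
class Snapshot:
    time_myr: float                        # output time of the experiment
    indicators: Dict[str, float] = field(default_factory=dict)
    post_processings: List[PostProcessing] = field(default_factory=list)

@dataclass
class Experiment:
    parameters: Dict[str, float]           # the chosen parameter values for this run
    snapshots: List[Snapshot] = field(default_factory=list)

@dataclass
class Protocol:
    description: str                       # applied physics and control parameters
    experiments: List[Experiment] = field(default_factory=list)

@dataclass
class Project:
    name: str                              # e.g. "prestellar core collapse"
    protocols: List[Protocol] = field(default_factory=list)

# Build one hypothetical project with a single experiment and snapshot.
run = Experiment(parameters={"mass_msun": 1.0, "temperature_k": 10.0})
run.snapshots.append(Snapshot(time_myr=0.1))
proj = Project(name="prestellar core collapse",
               protocols=[Protocol(description="isothermal collapse",
                                   experiments=[run])])
print(len(proj.protocols[0].experiments[0].snapshots))  # → 1
```

Navigating from a project down to a snapshot thus mirrors the way the database pages are organised.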
Two major difficulties, which reflect the complexity of the star formation process, should be kept in mind.
First, it is necessary to understand the limits of the available numerical models, in terms of the physical processes included but also in terms of spatial scales or, equivalently, numerical resolution. The interstellar medium is a highly complex environment in which turbulence, shocks, and radiative transfer take place, along with many chemical processes. Most of these processes are not fully understood (e.g. turbulence). It is clearly not possible to include all of them in the same model, and one must therefore choose to focus on a few aspects. Even when a process is included, intrinsic restrictions will generally limit the accuracy with which it is described (e.g. the numerical resolution in the case of turbulence drastically limits the effective Reynolds number to values orders of magnitude smaller than its real value). Numerical resolution is a particularly important parameter. Not only are scales below the resolution (the size of the smallest computational cells) by definition not described, but scales up to about 10 times this value are likely to be affected by numerical diffusion.
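The last point can be turned into a quick rule-of-thumb estimate of the smallest trustworthy scale in a grid simulation. The box size and cell count below are hypothetical example values, and the factor of 10 is the rough threshold quoted above, not a precise bound.

```python
# Hedged estimate of the smallest reliably resolved scale in a grid run.
# Assumption (from the text above): scales below ~10 cell sizes may be
# affected by numerical diffusion. Example values are hypothetical.
box_size_pc = 10.0   # simulation box size in parsecs (example value)
n_cells = 1024       # number of cells per dimension (example value)

cell_size_pc = box_size_pc / n_cells        # smallest computational cell
reliable_scale_pc = 10.0 * cell_size_pc     # treat smaller scales with caution

print(f"cell size ~ {cell_size_pc:.4f} pc; "
      f"reliable above ~ {reliable_scale_pc:.3f} pc")
```

Any structure in the simulation smaller than this estimate should be interpreted with caution when comparing to observations.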
Second, the interstellar medium is a multi-scale environment, ranging from the size of galaxies down to the radius of stars, which presents a wide range of physical conditions in terms of density, temperature, radiation field, etc. It is clearly not possible to cover them entirely, and only a small subset is represented in the database. The database will grow with time as new numerical experiments are performed, but it will never be possible to cover all scales and conditions.
Terms and conditions:
The data and information available through the StarFormat database and website are available under the terms of the Open Database Licence (ODbL). Any rights in individual contents of the database are licensed under the Database Contents License.
The licence can be read in full in English, French, or German. In essence, it means:
You are free: to Share, to Create, to Adapt. That means you can copy, distribute and use the database; you can produce works from the database; and you can modify, transform and build upon the database.
As long as you:
- Attribute: You must attribute any public use of the database, or works produced from the database, in the manner specified in the ODbL. For any use or redistribution of the database, or works produced from it, you must make clear to others the license of the database and keep intact any notices on the original database.
- Share-Alike: If you publicly use any adapted version of this database, or works produced from an adapted database, you must also offer that adapted database under the ODbL.
- Keep open: If you redistribute the database, or an adapted version of it, then you may use technological measures that restrict the work (such as DRM) as long as you also redistribute a version without such measures.