Large-Scale Scientific Information Systems
Research Group


Big Data Standardization

Standards establish interoperability, making tools work together across vendors, platforms, and applications. While the field of Big Data is large and diverse (and the term "Big Data" itself rather vague), some of its facets have matured sufficiently to allow for common standards.

The standards we have shaped or are contributing to serve two purposes. On the one hand, standards - as said above - help achieve interoperability. On the other hand, and equally important in practice, they condense best practices (for example, in serving massive spatio-temporal geo data) and thereby give guidance to implementers.

Our group, represented by its group head Peter Baumann, is actively shaping Big Data related standards in two areas: massive multi-dimensional arrays in ISO SQL (sometimes nicknamed "Science SQL", which is not quite correct, as arrays also occur in engineering and business data), and scalable geo services, such as the WCPS geo-raster filtering and processing language.
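To give a flavor of WCPS, the following sketch shows a typical server-side filtering and processing request: band arithmetic (an NDVI-style vegetation index) evaluated directly on the server and returned as an image. The coverage name and band names are hypothetical placeholders, not actual datasets:

```
for $c in ( Sentinel2_Scene )                   -- hypothetical coverage name
return encode(
    ( $c.nir - $c.red ) / ( $c.nir + $c.red ),  -- per-pixel band arithmetic
    "image/tiff"                                -- result encoding
)
```

The point of the language is that such expressions are evaluated inside the server, so only the (typically much smaller) result travels over the network.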

Standards developed by our group have high industry impact: they are implemented by both commercial vendors (such as ESRI, P) and open-source projects (like MapServer and GeoServer).

ISO

Functions:

Output:

Open Geospatial Consortium

Functions:

Output comprises spatio-temporal geo datacube standards in particular, but also further results beyond these (see the OGC coverage standards showcase illustrating n-D geo raster standards, based on rasdaman):
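As an illustration of datacube access in the WCPS style, the following sketch extracts a 2-D Lat/Long slice from a 3-D spatio-temporal datacube by fixing the time axis. The coverage name and axis labels are assumptions for illustration only:

```
for $c in ( AvgLandTemp )        -- hypothetical 3-D (Lat, Long, time) datacube
return encode(
    $c[ ansi("2014-07") ],       -- slice at one time step, yielding a 2-D grid
    "image/png"
)
```

Trimming (subsetting to an interval, e.g. `Lat(50:55)`) works the same way and can be combined freely with slicing across any of the cube's axes.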

Further Standardization and Collaboration Bodies

In addition, the group head, Peter Baumann, is engaged in several further bodies: