The Tetra Data Platform

The Tetra Data Platform (TDP) is the life sciences industry’s only open, vendor-agnostic, collaborative platform purpose-built for scientific data. Built on Amazon Web Services (AWS) infrastructure, the TDP ingests raw, primary data from thousands of sources and replatforms it to the Tetra Scientific Data and AI Cloud. It then engineers that data by extracting metadata and harmonizing it into a vendor-agnostic format that is liquid, large-scale, engineered, and compliant.

The TDP's event-driven architecture processes data as soon as it's collected and provides monitoring capabilities that let you view each step of a pipeline run. You can also run SQL and Query DSL queries against your data, either through the TDP user interface or through the TetraScience API.
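To make the two query styles concrete, the sketch below builds an Elasticsearch-style Query DSL body and an equivalent SQL statement side by side. The field names, table name, and values are illustrative assumptions, not the documented TetraScience schema.

```python
import json

# Hypothetical Query DSL (Elasticsearch-style) body: match files from a given
# instrument model, filtered to a creation-date range. Field names are
# illustrative assumptions, not the real TDP index schema.
query_dsl = {
    "query": {
        "bool": {
            "must": [
                {"match": {"instrument.model": "HPLC-1260"}},
            ],
            "filter": [
                {"range": {"createdAt": {"gte": "2024-01-01"}}},
            ],
        }
    },
    "size": 50,
}

# The same question expressed as SQL against a harmonized table
# (table and column names are likewise illustrative).
sql = """
SELECT file_id, created_at
FROM hplc_results
WHERE instrument_model = 'HPLC-1260'
  AND created_at >= DATE '2024-01-01'
LIMIT 50
"""

print(json.dumps(query_dsl, indent=2))
```

Either form could then be submitted through the TDP user interface or the TetraScience API.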

The TDP also provides GxP support, helping ensure data provenance through a comprehensive audit trail, disaster recovery, control matrices, and software hazard analysis.

Key Data Activities

The TDP allows you to perform many tasks related to processing, viewing, and managing data. The following are a few examples:

  • Replatform data: Use Tetra Integrations to automatically collect and move scientific data between different instruments and applications while centralizing that data in the Tetra Scientific Data and AI Cloud.
  • Contextualize files: Add attributes to files to make them easier to retrieve through search. For example, you can use Tetra Data Pipelines to programmatically add information about samples, experiment names, and laboratories. You can add metadata to files through the TDP user interface or through the TetraScience API.
  • Harmonize files: Parse proprietary instrument output files into a vendor-neutral format with scientifically relevant metadata through the Intermediate Data Schema (IDS), while also storing the data in SQL tables.
  • Enrich files: Get information from other files within the Tetra Scientific Data and AI Cloud to augment new data.
  • Push data to third-party applications: Send data to an electronic lab notebook (ELN), laboratory information management system (LIMS), analytics application, or an AI/ML platform.
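The contextualization step above can be sketched as merging search attributes into a file record before it is submitted through the TDP user interface or the TetraScience API. The `attach_metadata` helper and all field names here are hypothetical, for illustration only.

```python
def attach_metadata(file_record, metadata):
    """Return a copy of the file record with extra search attributes merged in."""
    enriched = dict(file_record)
    merged = dict(enriched.get("metadata", {}))  # copy so the original is untouched
    merged.update(metadata)
    enriched["metadata"] = merged
    return enriched

# Illustrative file record and contextual attributes (sample, experiment, lab).
raw_file = {"fileId": "abc-123", "path": "instrument-output/run-42.raw"}
context = {
    "sample_id": "S-0042",
    "experiment_name": "stability-study-q3",
    "laboratory": "boston-lab-2",
}

enriched = attach_metadata(raw_file, context)
```

Attributes like these are what make a file retrievable by search terms such as a sample ID or experiment name.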
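The harmonization step can likewise be sketched as parsing a proprietary instrument export into a vendor-neutral, IDS-like structure. The raw format, parser, and output field names below are illustrative assumptions, not a real Intermediate Data Schema definition.

```python
# Hypothetical proprietary export: key=value headers followed by peak rows.
raw_export = """\
SampleID=S-0042
Instrument=HPLC-1260
Peak,1,2.31,15234
Peak,2,4.87,90211
"""

def harmonize(raw_text):
    """Parse key=value headers and Peak rows into one harmonized record."""
    record = {"peaks": []}
    for line in raw_text.splitlines():
        if "=" in line:  # header line such as SampleID=S-0042
            key, value = line.split("=", 1)
            record[key.strip().lower()] = value.strip()
        elif line.startswith("Peak,"):  # peak row: number, retention time, area
            _, num, rt, area = line.split(",")
            record["peaks"].append(
                {"number": int(num), "retention_time": float(rt), "area": int(area)}
            )
    return record

ids_record = harmonize(raw_export)
```

A record in this shape can be stored once and queried uniformly, regardless of which vendor's instrument produced the original file.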