Size Limitations
This reference page describes how file upload and service size limits affect how the Tetra Data Platform (TDP) receives and processes data.
The Tetra Data Platform supports the limits listed in these tables for:
- File Upload
- Services
File Upload Limits
| File Upload | Limit | Notes |
|---|---|---|
| Manual file upload through the TDP UI | 200 MB | |
| File upload to the Tetra File-Log Agent through a GDC connector | 500 MB | |
| File upload to the Tetra File-Log Agent through direct upload to Amazon S3 | N/A | Multipart upload is used with a part size of 500 MB |
| File upload to all other Tetra Agents through a GDC connector | 500 MB | |
| File upload to all other Tetra Agents through direct upload to Amazon S3 | N/A | Multipart upload is used with a part size of 500 MB |
| Data Hub Connectors and Cloud Connectors (such as Egnyte, Box, Cellario, SDC, and Solace) | N/A | Size limit is set by the source system. For large files, multipart upload is used with a variable part size. |
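The limits in the table above can be sanity-checked client-side before sending a file. The sketch below is illustrative only (the helper names are not TDP APIs), and it assumes the limits are binary megabytes (MiB), which the table does not specify:

```python
MiB = 1024 * 1024
PART_SIZE = 500 * MiB  # part size for direct S3 multipart uploads (per the table above)
GDC_LIMIT = 500 * MiB  # per-file limit for uploads through a GDC connector
UI_LIMIT = 200 * MiB   # limit for manual file upload through the TDP UI


def multipart_part_count(file_size: int, part_size: int = PART_SIZE) -> int:
    """Number of parts a multipart upload would use (ceiling division)."""
    if file_size <= 0:
        raise ValueError("file_size must be positive")
    return -(-file_size // part_size)


def fits_gdc_limit(file_size: int) -> bool:
    """True if the file is within the 500 MB GDC connector limit."""
    return file_size <= GDC_LIMIT
```

For example, a 1.2 GB file exceeds the GDC limit but would be sent as a three-part direct S3 upload.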
Service Limits
| Service | Limit | Notes |
|---|---|---|
| AWS Elasticsearch/OpenSearch | 100 MB | 100 MB of indexable content per document |
| AWS S3 metadata and tags | 2 KB | The combined size of custom metadata and tags must be less than 2 KB |
| TetraScience TaskScript | 180 GB | TaskScript (input file) as part of the pipeline protocol |
| TetraScience Data Pipelines (Python) | 5 GB | |
| TetraScience Data Pipelines (Node.js) | N/A | Multipart upload is used |
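The 2 KB metadata-and-tags budget can also be checked before upload. The sketch below is a minimal, hypothetical helper (not a TDP API); it assumes the budget counts the UTF-8 bytes of all keys and values combined, which the table does not state explicitly:

```python
KB = 1024
METADATA_AND_TAGS_BUDGET = 2 * KB  # combined custom metadata + tags limit (per the table above)


def metadata_tags_size(metadata: dict[str, str], tags: dict[str, str]) -> int:
    """UTF-8 byte size of all custom metadata and tag keys and values combined."""
    items = list(metadata.items()) + list(tags.items())
    return sum(len(k.encode("utf-8")) + len(v.encode("utf-8")) for k, v in items)


def within_budget(metadata: dict[str, str], tags: dict[str, str]) -> bool:
    """True if the combined size is under the 2 KB limit."""
    return metadata_tags_size(metadata, tags) < METADATA_AND_TAGS_BUDGET
```

Validating this client-side avoids a failed write after the file itself has already been transferred.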