Product Security
Data
Data-at-Rest Encryption
The files stored within the Data Lake S3 bucket are encrypted at rest using AWS-managed Key Management Service (KMS) keys with 256-bit Advanced Encryption Standard (AES-256). AWS Secrets Manager is used to store credentials for external Application Programming Interfaces (APIs) and for AWS Relational Database Service (RDS) access.
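For illustration, a minimal sketch of writing a Data Lake object with SSE-KMS using the AWS SDK for JavaScript v3 follows; the bucket name, object key, and region are hypothetical:

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// Bucket and key names here are hypothetical.
await s3.send(
  new PutObjectCommand({
    Bucket: "example-data-lake-bucket",
    Key: "instrument/run-001/result.json",
    Body: JSON.stringify({ status: "ok" }),
    // Server-side encryption with KMS; omitting SSEKMSKeyId selects the
    // AWS-managed key (aws/s3), which encrypts with AES-256.
    ServerSideEncryption: "aws:kms",
  })
);
```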
Data-in-Transit Encryption
Data in transit is encrypted using HTTPS with Transport Layer Security (TLS) 1.2 and 256-bit encryption keys.
For Tetra Data Hub network access requirements, see Tetra Data Hub Allow List Endpoints.
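As an illustrative sketch, a Node.js HTTPS server can enforce a TLS 1.2 floor as shown below; the certificate paths are placeholders, and in a real deployment TLS is often terminated at a load balancer instead:

```typescript
import https from "node:https";
import { readFileSync } from "node:fs";

// Certificate paths are placeholders for illustration.
const server = https.createServer(
  {
    key: readFileSync("server-key.pem"),
    cert: readFileSync("server-cert.pem"),
    minVersion: "TLSv1.2", // refuse TLS 1.1 and earlier handshakes
  },
  (req, res) => {
    res.writeHead(200);
    res.end("ok");
  }
);

server.listen(443);
```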
Record Preservation
Data gathered from customers is regulated under 21 CFR Part 11 and must be protected against alteration or deletion.
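The specific control is not described in this section; as an illustrative sketch only, S3 Object Lock can enforce write-once retention on a versioned bucket. The bucket name and retention period below are hypothetical:

```typescript
import {
  S3Client,
  PutObjectLockConfigurationCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// Hypothetical bucket and retention period; Object Lock requires a
// versioned bucket with Object Lock enabled at creation time.
await s3.send(
  new PutObjectLockConfigurationCommand({
    Bucket: "example-data-lake-bucket",
    ObjectLockConfiguration: {
      ObjectLockEnabled: "Enabled",
      Rule: {
        // COMPLIANCE mode: no user, including root, can delete or
        // overwrite object versions during the retention period.
        DefaultRetention: { Mode: "COMPLIANCE", Years: 7 },
      },
    },
  })
);
```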
Application
Validation Controls
All data is validated using standard parameter-validation libraries appropriate to the expected content. Hardened Node.js libraries are leveraged to provide buffer size protections and connection-flood protection, refusing new connections beyond a certain level of load.
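As an illustrative sketch (not the platform's actual schemas), parameter validation with a standard library such as zod might look like this:

```typescript
import { z } from "zod";

// Hypothetical schema for an incoming file-upload request.
const UploadRequest = z.object({
  fileName: z.string().min(1).max(255),
  sizeBytes: z.number().int().positive().max(5 * 1024 ** 3), // reject oversized payloads
  labels: z.array(z.string().max(128)).max(50),
});

export function parseUploadRequest(body: unknown) {
  // Throws a descriptive validation error if any parameter is malformed.
  return UploadRequest.parse(body);
}
```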
Secure Delivery
The Tetra Web Application is configured to use a Content Security Policy (CSP) to ensure only trusted JavaScript is loaded.
HTTP security headers are enabled: Strict-Transport-Security enforces the use of HTTPS; X-Frame-Options is set to deny so that the web application cannot be loaded in a frame on another website; and X-Content-Type-Options is set to ‘nosniff’ to instruct the browser not to override the declared content type.
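A minimal sketch of setting these headers in a Node.js service follows; the CSP value and port are illustrative, not the application's actual policy:

```typescript
import http from "node:http";

// Header values are illustrative, not the application's actual policy.
const server = http.createServer((req, res) => {
  res.setHeader("Content-Security-Policy", "default-src 'self'");
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  res.setHeader("X-Frame-Options", "DENY");
  res.setHeader("X-Content-Type-Options", "nosniff");
  res.end("ok");
});

server.listen(8080);
```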
Penetration Testing
TetraScience contracts an independent firm on an annual basis to perform a penetration test of the Tetra Data Platform product itself.
Vulnerability Management
Packaged code is automatically scanned for vulnerabilities, with any high or critical vulnerabilities remediated, before being released.
Cloud Infrastructure
AWS Services
Security tooling is leveraged to assess, audit, and evaluate the configurations of various AWS services. Virtual Private Cloud (VPC) endpoints are used to establish communication with supported AWS services. Authentication is managed by IAM roles and credentials, with least privilege applied to each role assigned to the services.
Amazon Elastic Container Registry (AWS ECR) is not publicly accessible. Uploaded container images are scanned with AWS ECR Image Scanning, and any high or critical vulnerabilities are remediated before the container image is published.
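As a sketch of how such a gate might be automated with the AWS SDK v3 (the repository name and image tag are hypothetical):

```typescript
import {
  ECRClient,
  DescribeImageScanFindingsCommand,
} from "@aws-sdk/client-ecr";

const ecr = new ECRClient({ region: "us-east-1" });

// Repository name and tag are hypothetical.
const { imageScanFindings } = await ecr.send(
  new DescribeImageScanFindingsCommand({
    repositoryName: "example-service",
    imageId: { imageTag: "release-candidate" },
  })
);

const counts = imageScanFindings?.findingSeverityCounts ?? {};
if ((counts.HIGH ?? 0) > 0 || (counts.CRITICAL ?? 0) > 0) {
  throw new Error("High or critical findings present; image must not be published.");
}
```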
TetraScience primarily utilizes AWS Fargate (ECS) for compute, with one notable exception: Windows worker EC2 nodes for C# task scripts. In this case, TetraScience follows AWS security best practices by using an AWS Windows AMI. TetraScience created a Lambda function that continually polls AWS for a new Windows AMI; upon detecting one, TetraScience deploys the latest version.
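An illustrative sketch of such a poller is shown below. This is not TetraScience's actual function, but AWS does publish the latest Windows AMI IDs as public SSM parameters:

```typescript
import { SSMClient, GetParameterCommand } from "@aws-sdk/client-ssm";

const ssm = new SSMClient({ region: "us-east-1" });

// AWS publishes the latest Windows AMI IDs as public SSM parameters.
const { Parameter } = await ssm.send(
  new GetParameterCommand({
    Name: "/aws/service/ami-windows-latest/Windows_Server-2022-English-Full-Base",
  })
);

// A scheduled Lambda could compare Parameter.Value against the AMI
// currently deployed and trigger a rollout when it changes.
console.log("Latest Windows AMI:", Parameter?.Value);
```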
Identity and Access Control
Single Sign-on (SSO)
The Tetra Data Platform web interface can integrate with Identity Providers that comply with the industry-standard SAML 2.0 protocol. AWS Cognito is leveraged to broker the integration with the Identity Provider.
The Tetra Data Platform does not store SSO credentials. TDP redirects authentication to Cognito, which in turn redirects to your identity provider; TDP stores only the user's email address and ID.
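For illustration, the redirect typically targets Cognito's hosted authorization endpoint; the domain, client ID, provider name, and callback URL below are all hypothetical:

```typescript
// The domain, client ID, provider name, and callback URL are hypothetical.
const cognitoDomain = "https://example-tdp.auth.us-east-1.amazoncognito.com";

const params = new URLSearchParams({
  response_type: "code",
  client_id: "example-client-id",
  redirect_uri: "https://tdp.example.com/sso/callback",
  identity_provider: "ExampleSamlIdp", // the customer's SAML 2.0 IdP in Cognito
  scope: "openid email",
});

// The browser is redirected here; Cognito then redirects on to the IdP.
console.log(`${cognitoDomain}/oauth2/authorize?${params}`);
```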
Customer responsibility: TetraScience strongly recommends that customers include Multi-Factor Authentication (MFA) as part of their Single Sign-On strategy.
Least Privilege Access
Authorization is accomplished using role-based access control driven by the user’s attributes, managed and controlled with federated SSO.
During deployment, full AWS administrative privileges are required for the user performing the deployment; these privileges apply only to the services used by the platform.
All platform components have restrictive roles with the minimum privileges required to perform their functions.
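As a hedged illustration of what such a least-privilege role definition can look like (using AWS CDK; the component name and resource ARN are hypothetical):

```typescript
import * as iam from "aws-cdk-lib/aws-iam";

// Hypothetical component and resource ARN: the role may only read
// objects under its own Data Lake prefix, nothing more.
const readOwnPrefixOnly = new iam.PolicyStatement({
  effect: iam.Effect.ALLOW,
  actions: ["s3:GetObject"],
  resources: ["arn:aws:s3:::example-data-lake-bucket/component-a/*"],
});
```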
Password Management
When SSO is not used, the Tetra Data Platform stores passwords in a SQL database using one-way (non-reversible) cryptographic hashing with a unique salt for each user.
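A minimal sketch of that approach using Node's built-in crypto module follows; the salt and key lengths are illustrative:

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// One-way hashing with a unique per-user salt; lengths are illustrative.
export function hashPassword(password: string): string {
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 64);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

export function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const candidate = scryptSync(password, Buffer.from(saltHex, "hex"), 64);
  // Constant-time comparison avoids leaking timing information.
  return timingSafeEqual(candidate, Buffer.from(hashHex, "hex"));
}
```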
Passwords are also stored in the TDP Agent when remote SMB file sharing is used to collect files. The Agent stores the password using SHA-256 with a protected key.
Product Delivery Process
Peer Code Review
Code commits to the product are reviewed by peer engineers for quality assurance and security reviews.
Security Design Reviews
Major product changes and new features undergo specialized security reviews and risk assessment by qualified personnel early in the design phase, and as required throughout implementation.
Secure Software Development Lifecycle (SDLC)
Any new software development, including features and bug fixes, goes through TetraScience’s SDLC process. This includes configuration management, leadership reviews, peer review of code commits, and deployment approval.
All code submitted must be reviewed by another engineer before being merged.
Code is then scanned by a static code vulnerability scanner, providing feedback to the engineer.
Included libraries are evaluated for known vulnerabilities.
Code is loaded into container images, which are scanned with AWS Image Scanning whenever there is a change to the container image.
Lastly, TetraScience engages an independent, outside party on an annual basis to perform penetration testing.
Quality Control Process
TetraScience performs continuous Quality Control as part of the overall development process, leveraging a variety of automated testing practices, such as unit tests to ensure the software runs as expected and regression tests to ensure bugs are not reintroduced.
Monitoring
Active logging and monitoring are implemented using AWS CloudTrail and CloudWatch for the infrastructure. The Tetra Data Platform has audit logging enabled for user actions, and logs are available for analysis through CloudWatch, with alerts configured on them.
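As one illustrative example of wiring audit logs to alerts (the log group name and filter pattern are hypothetical):

```typescript
import {
  CloudWatchLogsClient,
  PutMetricFilterCommand,
} from "@aws-sdk/client-cloudwatch-logs";

const logs = new CloudWatchLogsClient({ region: "us-east-1" });

// Log group name and filter pattern are hypothetical: count failed
// sign-ins so a CloudWatch alarm can be attached to the metric.
await logs.send(
  new PutMetricFilterCommand({
    logGroupName: "/tdp/audit",
    filterName: "FailedLogins",
    filterPattern: '"LOGIN_FAILED"',
    metricTransformations: [
      {
        metricName: "FailedLogins",
        metricNamespace: "TDP/Audit",
        metricValue: "1",
      },
    ],
  })
);
```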
Business Operations Process
Disaster Recovery
Customer data stored on TetraScience infrastructure is backed up in order to mitigate the impact of events such as:
- Accidental deletion by customer or employee
- Corruption of data due to software or human error
- General system failure
- Physical or environmental disaster at a data center site
- External attack resulting in stolen or ransomed data
To ensure timely and stable restoration of data, TetraScience backup and restore procedures are tested on a periodic basis. Data backups are protected with security equal to or greater than that of the original system where the data was stored. Backup copies are retained for a sufficient period of time to ensure that data loss events can be mitigated even if they are not immediately discovered.
Customer data stored in cloud filestores such as S3 may be backed up using those systems’ built-in versioning capabilities. Versioned systems record version history for each stored object and offer the ability to restore that object to any previously saved state. Versioning is typically simpler to operate and more cost-efficient than full backup snapshots, especially for file objects that change less frequently.
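A short sketch of both halves (enabling versioning and restoring a prior version) with the AWS SDK v3; the bucket, key, and version ID are placeholders:

```typescript
import {
  S3Client,
  PutBucketVersioningCommand,
  CopyObjectCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });
const bucket = "example-data-lake-bucket"; // hypothetical

// Enable versioning so overwrites and deletes preserve prior versions.
await s3.send(
  new PutBucketVersioningCommand({
    Bucket: bucket,
    VersioningConfiguration: { Status: "Enabled" },
  })
);

// Restore an object by copying a known prior version over the current one.
await s3.send(
  new CopyObjectCommand({
    Bucket: bucket,
    Key: "instrument/run-001/result.json",
    CopySource: `${bucket}/instrument/run-001/result.json?versionId=EXAMPLE_VERSION_ID`,
  })
);
```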
Security Incident Management
Incident management responsibilities and procedures which cover all TetraScience services must be established and documented. These Incident Management Practices and Procedures are to ensure incidents can be resolved as quickly as possible, and business impacts are minimized. All third parties upon which TetraScience is dependent for the delivery of services and business continuity (e.g. cloud service providers) must have acceptable incident management procedures in place. At a minimum, the incident management procedures should include:
- A notification process
- Security incident data collection and analysis processes
- Security incident documentation requirements
- Internal and external communication requirements
- Response action procedures
- A process for post-mortem analysis
- A process to review sufficiency of controls and corrective actions.
All information gathered during incident and malfunction remediation activities are to be treated as TetraScience confidential information in accordance with Data Classification and Handling policies.