Continuous Application Performance Management (CAPM) in EDC/CDMS


Table of Contents

1. Human-centric discovery & iteration
2. Application Performance Monitoring (APM)

Continuous application performance monitoring is a strategy we employ at Castor to ensure our users get the best experience from our software.

As study datasets grow, we continuously re-assess our data strategies in order to remain both flexible and performant. Our current approach combines the following two techniques:

 

1. Human-centric discovery & iteration 

Because of the flexible nature of the Castor EDC/CDMS data model, performance concerns can arise in different areas of the application depending on the study structure. For this reason, we take a human-centric approach to prioritising performance improvements, tackling the biggest issues on a study-by-study basis.

 

2. Application Performance Monitoring (APM)

We use application performance monitoring tools to visualise our entire application infrastructure at a glance, identify bottlenecks, accelerate performance work and stay focused on business results. Our engineering teams create service level indicators (SLIs) so that we can quickly understand the health of our applications, both after making changes and as our clients' datasets grow. Where necessary, we also attach a Service Level Objective (SLO), which triggers an alert whenever one of our applications falls below its threshold.
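
To make the SLI/SLO pattern concrete, the sketch below shows how a success-rate SLI could be computed over a window of requests and checked against a 99.99% objective. It is a minimal illustration, not Castor's actual monitoring code: the request structure, endpoint paths, function names and alert handling are assumptions made for the example.

```python
# Minimal sketch of a success-rate SLI with an SLO check.
# All names, paths and thresholds here are illustrative assumptions,
# not Castor's actual monitoring implementation.
from dataclasses import dataclass


@dataclass
class RequestSample:
    endpoint: str       # e.g. "/study/open" (hypothetical path)
    status_code: int    # HTTP status of the response
    duration_ms: float  # server-side response time


def success_rate_sli(samples: list[RequestSample]) -> float:
    """SLI: fraction of requests that returned a 2xx response."""
    if not samples:
        return 1.0
    ok = sum(1 for s in samples if 200 <= s.status_code < 300)
    return ok / len(samples)


def meets_slo(sli: float, objective: float = 0.9999) -> bool:
    """SLO check: True if the SLI meets the objective (e.g. >99.99%)."""
    return sli >= objective


if __name__ == "__main__":
    window = [
        RequestSample("/study/open", 200, 320.0),
        RequestSample("/record/create", 200, 180.0),
        RequestSample("/study/open", 500, 950.0),
    ]
    sli = success_rate_sli(window)
    if not meets_slo(sli):
        # In a real setup this is where an alert would be raised.
        print(f"ALERT: success rate {sli:.2%} is below the 99.99% objective")
```

In practice a check like this is typically configured directly in the APM tool (for example as an alert condition) rather than written in application code.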

Castor Service Levels:

Category | Indicator | Exclusions | Objective | Measured?
--- | --- | --- | --- | ---
General | Overall success rate: % of successful requests measured across all endpoints ✓ | | >99.99% | Yes
General | Overall 95th percentile response time ✓ (also measured at the 50th and 99th percentiles) | Export requests | | Yes
General | Apdex score [T=0.5s] as measured by New Relic, calculated over 4 weeks | | >= 0.9 | Yes
General | Error rate (complement of the overall success rate; covers HTTP 200 responses with success=false) ✓ | | | Yes
Opening Studies | % of successful requests opening a Study (2xx response) (StudyLoadController::loadStudyDataAction()) ✓ | | >99.99% | Yes
Opening Studies | 95th percentile response time on opening Study requests ✓ (also measured at the 50th and 99th percentiles) | | | Yes
Opening Studies | % of successful requests retrieving the list of studies (2xx response) (StudyListController::listStudiesAction()) | | >99.9% | No
External Services | % of emails successfully delivered from the application to the Transactional Email Provider API | | >99.9% | No
External Services | % of successful encryption calls (HTTP 2xx) with the Encryption Provider (Google KMS) | | >99.9% | Yes
External Services | % of successful upload/download requests (HTTP 2xx) with LFS buckets (Google Cloud Storage) | | >99.9% | Yes
Data Entry | % of successful Create Record requests ✓ | | >99.99% | Yes
Data Entry | 95th percentile response time on Create Record requests ✓ (also measured at the 50th and 99th percentiles) | | | Yes
Data Entry | % of successful Data Entry requests ✓ | | >99.99% | Yes
Data Entry | % of successful Randomization requests | | >99.99% | No
Data Entry | % of successful Export requests | | >99% | No
Data Entry | 95th percentile response time on data entry requests ✓ (also measured at the 50th and 99th percentiles) | | <750 ms | Yes
Queuing | TBD | | |
Login | % of successful Authentication/Login requests ✓ | | | Yes
Login | 95th percentile response time on login requests ✓ (also measured at the 50th and 99th percentiles) | | | Yes
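
For reference, the Apdex score in the table follows the standard Apdex formula: requests completing within the threshold T (0.5 s here) count as satisfied, requests between T and 4T as tolerating, and slower requests as frustrated, giving Apdex_T = (satisfied + tolerating/2) / total. The sketch below illustrates that calculation; the sample durations are invented for the example.

```python
# Minimal sketch of the standard Apdex_T calculation.
# Thresholds follow the Apdex convention (satisfied <= T, tolerating <= 4T);
# the sample durations below are invented for illustration.
def apdex(durations_s: list[float], t: float = 0.5) -> float:
    """Apdex_T = (satisfied + tolerating / 2) / total."""
    if not durations_s:
        return 1.0
    satisfied = sum(1 for d in durations_s if d <= t)
    tolerating = sum(1 for d in durations_s if t < d <= 4 * t)
    return (satisfied + tolerating / 2) / len(durations_s)


# Example: two satisfied, two tolerating and one frustrated request -> 0.6,
# which would fall short of the >= 0.9 objective in the table above.
print(apdex([0.2, 0.4, 0.6, 1.2, 3.0], t=0.5))
```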

 

 

If your study is showing abnormal loading times in certain areas, you can submit a report to our support team by emailing support@castoredc.com. Please include the study ID (which can be found in the study Settings) and describe which actions cause the performance issues.
 

 

 
