Blood Pressure Endpoints for Clinical Trials: Are You Monitoring?

Submitted by Jeffrey Heilbraun on October 8, 2013

This week, I will be hosting a webinar focused on considerations and best practices for Blood Pressure (BP) monitoring as part of the cardiac safety assessment for compounds in development. The evaluation of BP responses to drugs being developed for non-cardiovascular indications is garnering increased public awareness and regulatory focus, evidenced by formal scientific discussions at prominent meetings and recent publications by the Cardiac Safety Research Consortium (CSRC) on this topic.

When designing a study aimed at measuring the “off-target” BP effect of a compound, there are a number of factors to keep in mind. Here are 3 key considerations for accurately defining the off-target BP effect and maximizing the potential of your blood pressure cardiac safety study.

  1. Do changes in blood pressure relate to compound concentration?
    It is important to determine whether an off-target BP signal is associated with increasing drug concentration or is independent of it. Establishing whether the observed change in BP (as well as other safety and efficacy measures) shows dose dependency provides valuable clinical information, including a specific concentration threshold associated with changes in BP. Evaluating the concentration effect on the BP signal within a single ascending dose (SAD) or multiple ascending dose (MAD) study early in development can be beneficial before moving into a phase II, patient-based population.
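The dose-dependency question above can be sketched with a toy calculation. This is a hypothetical illustration only: the paired observations, the `fit_slope` helper, and the +5 mmHg limit are all invented for the example, and real PK/PD analyses use mixed-effects concentration-effect models rather than a single least-squares line.

```python
# Hypothetical sketch: is the change in systolic BP associated with plasma
# drug concentration, and at what fitted concentration does it cross a limit?

def fit_slope(conc, delta_bp):
    """Ordinary least-squares slope and intercept of delta_bp vs. conc."""
    n = len(conc)
    mean_x = sum(conc) / n
    mean_y = sum(delta_bp) / n
    sxx = sum((x - mean_x) ** 2 for x in conc)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, delta_bp))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def concentration_threshold(slope, intercept, bp_change_limit=5.0):
    """Concentration at which the fitted line crosses a BP-change limit
    (here an illustrative +5 mmHg); None if the slope is not positive."""
    if slope <= 0:
        return None
    return (bp_change_limit - intercept) / slope

# Invented paired observations: plasma concentration (ng/mL) and change
# from baseline in systolic BP (mmHg).
conc = [10, 25, 50, 100, 200]
delta_sbp = [0.5, 1.2, 2.8, 5.1, 10.2]

slope, intercept = fit_slope(conc, delta_sbp)
```

A positive, non-trivial slope in such a sketch would motivate the formal concentration-effect modeling the text describes.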

  2. How do blood pressure signals differ between short- and long-term exposure to a therapeutic compound?
    Establishing a comprehensive BP profile of a compound provides valuable information from a clinical management and regulatory perspective. When evaluating a study compound, it is important to determine whether the BP response reaches a plateau or continues to increase as a function of extended exposure. Furthermore, from a safety perspective it may be important to understand what happens to BP upon cessation of drug treatment (i.e., does BP return to baseline or to a clinically appropriate threshold?). Data on the fluctuation of BP signals are important because they inform the sponsor whether changes in BP resolve on their own or whether additional intervention (in the form of secondary medications) is required to use the compound safely. This can become an important consideration depending on the therapeutic indication and whether the medication is taken on a long-term basis, a short-term basis, or intermittently when disease symptoms are present.
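The two questions in this item (does the response plateau, and does BP return to baseline after cessation?) lend themselves to a small sketch. Everything here is hypothetical: the weekly readings, the helper names, and the 1 and 2 mmHg tolerances are illustrative stand-ins for criteria that would actually come from the protocol.

```python
# Hypothetical sketch: classify an extended-exposure BP trajectory and
# check post-cessation return toward baseline. Tolerances are illustrative.

def classify_trajectory(weekly_bp, plateau_tolerance=1.0):
    """'plateau' if the mean of the last third of on-treatment weeks is
    within plateau_tolerance mmHg of the middle third, else 'rising'."""
    n = len(weekly_bp)
    third = n // 3
    mid = sum(weekly_bp[third:2 * third]) / third
    late = sum(weekly_bp[2 * third:]) / (n - 2 * third)
    return "plateau" if late - mid <= plateau_tolerance else "rising"

def returns_to_baseline(baseline, post_cessation_bp, tolerance=2.0):
    """True if the last post-cessation reading is within tolerance mmHg
    of the pre-treatment baseline."""
    return abs(post_cessation_bp[-1] - baseline) <= tolerance

# Invented mean systolic BP values (mmHg).
baseline = 120.0
on_treatment = [121, 123, 125, 126, 126.3, 126.5]  # rises, then flattens
after_stop = [125, 123, 121.5]                     # drifts back toward baseline
```

In this invented example the on-treatment curve flattens and BP drifts back after cessation, the pattern the text calls "resolvable."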

  3. What are the available options for evaluating blood pressure changes?
    When it comes to monitoring BP, there are several options for the sponsor to consider based on a number of parameters, including clinical study phase, patient population, therapeutic indication, and the BP response profiles obtained during early compound development. Advances in BP technology enable efficient capture of BP measurements in the clinic and office setting as well as directly from a patient’s home. Pilot studies of BP measurements are often used as a guide for selecting an approach. Common BP measurement and monitoring options include:
  • 24-hour Ambulatory Blood Pressure Monitoring. ABPM is a key diagnostic technology, providing surrogate endpoint data describing BP changes over 24-hour periods. Continuous monitoring enables evaluation across circadian rhythms, providing a more comprehensive analysis of BP changes. ABPM data are now included in a number of regulatory submissions for novel drugs and are increasingly considered when determining new drug safety during regulatory review.

  • Automated office blood pressure monitoring. The benefits of using automated BP monitoring include calibration and standardization of equipment across study sites and removal of variability resulting from human/auscultatory bias.

  • Remote home blood pressure monitoring. Telemonitoring provides a means of electronically transferring study participant, self-monitored BP data to a central repository, reducing the number of patient visits and increasing patient compliance for a study. This methodology is valuable for providing real time visibility into blood pressure trends during the conduct of the study.
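As a rough illustration of the kind of summary statistic a 24-hour ABPM session yields, the sketch below computes nocturnal "dipping," the percent fall in mean systolic BP from daytime to nighttime across the circadian cycle. The hourly readings, the night-hour window, and the 10% dipper convention are stated assumptions, not data from any study discussed here.

```python
# Hypothetical sketch: nocturnal dip from one 24-hour ABPM session.
# A drop of roughly 10% or more overnight is conventionally called "dipping".

def mean_sbp(readings, hours):
    """Mean systolic BP over the (hour, sbp) readings whose hour is in hours."""
    vals = [sbp for hour, sbp in readings if hour in hours]
    return sum(vals) / len(vals)

def nocturnal_dip_percent(readings, night=range(0, 6)):
    """Percent drop in mean systolic BP during night hours vs. the rest of the day."""
    night_hours = set(night)
    day_hours = {h for h, _ in readings} - night_hours
    day = mean_sbp(readings, day_hours)
    nocturnal = mean_sbp(readings, night_hours)
    return 100.0 * (day - nocturnal) / day

# Invented hourly (hour, systolic mmHg) pairs: 130 by day, 113 overnight.
readings = [(h, 130) for h in range(6, 24)] + [(h, 113) for h in range(0, 6)]
dip = nocturnal_dip_percent(readings)
```

A blunted or absent dip on treatment, relative to baseline ABPM, is exactly the kind of circadian change that spot office measurements would miss.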

Be sure to join me on October 10th to further explore best practices for BP monitoring in clinical trials and to gain perspective from a pharmaceutical sponsor who has been directly involved in defining an off-target BP signal for a compound in development.

Link to original press release: http://www.bioclinica.com/blog/blood-pressure-endpoints-clinical-trials-are-you-monitoring

Medidata Launches Novel Study of Technology-Enabled Patient Engagement on Changing Health Outcomes in Diabetes Community

First Medidata-Sponsored Clinical Trial Will Use mHealth Devices Linked to Cloud Technology for Lifestyle Management

NEW YORK, N.Y. – September 19, 2013 – Medidata Solutions (NASDAQ: MDSO), the leading global provider of cloud-based solutions for clinical research in life sciences, today announced the sponsorship of its first clinical trial, which will evaluate the impact of mobile and cloud-based technology on patient engagement for improved health outcomes in the diabetes community.

By providing patients with the tools they need to track behavioral factors such as weight and physical activity, the pioneering study will test whether technology can increase the rates of therapy adoption and drive better outcomes for people with diabetes. Medidata’s study will use mHealth devices that provide immediate feedback to participants and are wirelessly linked to targeted messaging for unique patient engagement.

The study is being conducted in collaboration with Medidata Technology Partner Spaulding Clinical Research, a leading clinical research solution provider and medical device manufacturer, which will use its capabilities to enable the mHealth devices to connect to the Medidata Clinical Cloud™. Withings, a top innovator in health and wellness smart devices and applications, will supply the mHealth devices that will monitor the physical activity and weight of study participants. Leading endocrinologist Dr. Zachary T. Bloomgarden, Clinical Professor of Medicine, Endocrinology, Diabetes and Bone Disease at Mount Sinai Hospital in New York and co-editor of the Journal of Diabetes, is the study’s principal investigator. The initial feasibility stage of the study is slated to start in the fourth quarter of 2013 and will be followed by a randomized clinical trial.

“We are proud to sponsor our first clinical study in collaboration with Spaulding and Withings to see how disruptive technology can improve patient engagement and actually lead to better medical outcomes,” said Glen de Vries, president of Medidata Solutions. “Since our beginning, Medidata has pioneered the use of technology to transform clinical research, and the opportunity to see our technology used to improve patient health in the critical area of diabetes care is a logical—and incredibly exciting—extension.”

More than 20 million Americans have been diagnosed with diabetes, a number that has nearly doubled over the past decade due to factors including physical inactivity and obesity, with significant individual and societal impacts and costs. Through activity monitoring devices, uploaded data and targeted feedback, the Medidata-sponsored study will explore the potential of technology tools to improve patient adherence to lifestyle changes. The feedback provided to patients will address exercise levels as well as diet and regular use of recommended medications.

Randy Spaulding, founder and CEO of Spaulding Clinical Research, said, “Wireless and personal mobile devices provide opportunities to improve patient engagement because of their ease of use, real-time transmission of data and increased portability and convenience. This collaboration is an important step in bringing these benefits to the real world.”

“One of the major challenges in diabetes management is working with patients to adopt lifestyle changes,” added Dr. Bloomgarden. “Most diabetics have to live with this chronic condition for a long time, so using personal devices and patient engagement apps to improve quality of life would be very powerful and a huge win for our clinical care models.”

Link to original Press Release: http://www.mdsol.com/medidata-launches-novel-study-technology-enabled-patient-engagement-changing-health-outcomes-diabetes-community

5 Things Every Data Manager Should Know When It Comes to Medical Images in Clinical Trials

Submitted by Colin Miller on September 9, 2013

This week the annual Society for Clinical Data Management (SCDM) conference, the world’s largest educational event for clinical data managers and related professionals, will be held in Chicago. I have been invited to give a talk aimed at clinical data managers, highlighting some of the complexities and challenges associated with trials containing imaging endpoints.

While preparing for my presentation and thinking about the challenges that are faced by data managers, I put together the following checklist of what I consider best practices for managing and processing imaging data in clinical trials.

1. Understand the Imaging Review Charter

Clinical trials with imaging endpoints require an Imaging Review Charter (IRC). The IRC serves as a roadmap for standardizing and interpreting data from trials with imaging endpoints, providing a comprehensive and detailed description of the clinical trial imaging methodology. It is important to ensure that the Charter and the export specifications document match with respect to the primary endpoints. Although additional data or assessments not described in the Charter may be exported, the key assessments have to match the content of the Charter. An understanding of the IRC will help ensure that data managers are in tune with the overall flow of data for the trial and up to speed with all imaging data that will fall under their supervision.

2. Understand your imaging data

Data managers should be familiar with the imaging endpoint(s) being measured in a given study. Different endpoints provide different data outputs. Some data are quantitative at the time of acquisition (e.g., PET and DXA scans), while other data are derived from image measurements (e.g., lesion area or volume). Another type of data output is the scoring system, which is commonly used in many therapeutic areas (e.g., the Genant scale for osteoporosis or the modified Sharp score for rheumatoid arthritis) and provides semi-quantitative data. Familiarizing yourself with the measurements feeding into an eCRF is crucial to understanding and validating data and will facilitate the development of appropriate edit checks for a study.

3. Develop the edit checks being applied to your imaging data early in the process

Developing optimal edit checks for each imaging endpoint is important to ensure high-quality data. Different imaging endpoints will require different edit checks due to the inherent variability of different modalities and measurements. Longitudinal studies (e.g., lesion tracking in oncology studies) provide multiple measurements and track differences over time, making it necessary to understand the extent of variability that can be tolerated in a given measurement. Imaging core labs are often tasked with performing edit checks, so it is critical for the data manager to understand these edit checks and the rationale behind them.
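The kind of longitudinal edit check described above can be sketched in a few lines. This is a hypothetical illustration: the field names, the lesion-volume figures, and the ±25% tolerated band are all invented, and real tolerances would come from the Imaging Review Charter and the core lab's specifications.

```python
# Hypothetical edit-check sketch: flag visit-to-visit lesion-volume changes
# that fall outside a tolerated variability band and should trigger a query.

def flag_implausible_changes(visits, max_fractional_change=0.25):
    """Return (visit_label, fractional_change) pairs whose change from the
    previous visit exceeds the tolerated band."""
    flags = []
    for prev, curr in zip(visits, visits[1:]):
        change = (curr["volume_mm3"] - prev["volume_mm3"]) / prev["volume_mm3"]
        if abs(change) > max_fractional_change:
            flags.append((curr["visit"], round(change, 3)))
    return flags

# Invented longitudinal measurements for one lesion.
visits = [
    {"visit": "baseline", "volume_mm3": 1200.0},
    {"visit": "week8",    "volume_mm3": 1150.0},  # about -4%: plausible
    {"visit": "week16",   "volume_mm3": 600.0},   # large drop: query the site
]
flags = flag_implausible_changes(visits)
```

Flagged values are not necessarily errors: a large true response can look like an outlier, which is why the data manager needs to understand the rationale behind each check rather than apply it mechanically.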

4. Understand the read design

The read design for a clinical trial ultimately dictates the imaging data that will pass through the hands of the data manager. Different clinical trials utilize different reader paradigms, from a relatively straightforward single reader to more complex paired reads with adjudication. The choice of read paradigm is based on a number of factors, including study phase, regulatory compliance, operational efficiency, and cost-benefit. By understanding the selected reader paradigm, data managers can understand the flow of data in a trial and anticipate the amount of data they will be handling throughout its course.

5. Visit the core lab

Although this may sound obvious, I encourage all data managers working on an outsourced clinical trial to establish a relationship with the vendor of your study. For clinical trials in which an imaging core lab is utilized for centralized image analysis, it is important to be involved in communications with the core lab from the start of the trial. Visiting the core lab and participating in conference calls between the sponsor and core lab are good ways to ensure open lines of communication during the course of the trial.

With medical imaging playing a prominent role in today’s clinical trials, data managers must be aware of the challenges associated with managing complex imaging data. The SCDM is an important conference and I’m looking forward to sharing my thoughts on this topic at the meeting. I hope to see you there!

Link to original post: http://www.bioclinica.com/blog/5-things-every-data-manager-should-know-when-it-comes-medical-images-clinical-trials

5 Strategies for Speed and Quality Data in High Volume Clinical Trials

Submitted by Jennifer Kelly on September 6, 2013

Are you a clinical trial data manager concerned that any of the following pose a threat to your data quality?

  • A clinical trial with a large patient population
  • Reconciliation of multiple data points
  • Aggressive timelines and massive amounts of data

If you’re wrestling with these issues, you’re not alone. Maintaining quality along with speed and volume is a real balancing act for data managers — and it’s becoming increasingly difficult as the industry is challenged to hold down rising costs while accelerating therapies to market.

Learn strategies to achieve both speed and data quality in large-scale clinical trials, including those with thousands of subjects, during the annual SCDM Conference in Chicago. On September 12th, I will be presenting “Key Steps in Maximizing the Production of Quality Data in High Volume Clinical Trials.”

You’ll find strategies in all areas of clinical data management, including:

  • eCRF Development and Design – Learn how CDASH requirements can be a big time-saver, plus how to decrease user error and queries by simplifying your forms.
  • System Functionality – See how derivation and work flow edits can reduce data entry errors and cleaning time by effectively using enable/disable and hard trap.
  • Data Review – Hear how you can simplify data review on even the most complicated eCRFs using the MedDRA and WHO Drug dictionaries. See how to track unacceptable query responses and how to maximize reviewer time by changing the way forms are assigned.
  • Ancillary Data Integration – Discover what vendor data really needs to be imported for data cleaning.
  • Quality Control – Learn the best time to perform QC data review and how often it should be done.

Link to original post: http://www.bioclinica.com/blog/5-strategies-speed-and-quality-data-high-volume-clinical-trials

Medidata Solutions Announces Exercise in Full of Overallotment Option for 1.00% Convertible Senior Notes Due 2018

NEW YORK, N.Y. – August 8, 2013 – Medidata Solutions, Inc. (NASDAQ:  MDSO) today announced that the initial purchasers for the previously announced private placement of $250 million aggregate principal amount of Medidata’s 1.00% Convertible Senior Notes due 2018 (the “Notes”) to qualified institutional buyers pursuant to Rule 144A under the Securities Act of 1933, as amended (the “Act”), have elected to fully exercise their overallotment option to purchase an additional $37.5 million aggregate principal amount of the Notes. With the exercise of the overallotment option, a total of $287.5 million aggregate principal amount of the Notes will be sold at the closing of the offering, which is expected to occur on August 12, 2013, subject to customary closing conditions.

Medidata intends to use the net proceeds from the offering for working capital and other general corporate purposes, including to fund possible acquisitions of, or investments in, complementary businesses, products, services, technologies and capital expenditures. Medidata has not entered into any agreements or commitments with respect to any acquisitions or investments at this time.

Neither the Notes nor the shares of Medidata’s common stock issuable upon conversion of the Notes, if any, have been registered under the Act or the securities laws of any other jurisdiction and, unless so registered, may not be offered or sold in the United States absent registration or an applicable exemption from such registration requirements.

This announcement is neither an offer to sell nor a solicitation of an offer to buy any of these securities and shall not constitute an offer, solicitation or sale in any jurisdiction in which such offer, solicitation or sale is unlawful. Any offers of the securities will be made only by means of a private offering memorandum pursuant to Rule 144A under the Act.

Link to original Press Release: http://www.mdsol.com/press/Medidata-Solutions-Announces-Exercise-Full-Overallotment-Option-for-1-percent-Convertible-Senior-Notes-Due-2018

Top Global Pharma Daiichi Sankyo Extends Use of the Medidata Clinical Cloud to China

Leading Pharma Looks to Medidata Solutions to Bring Operational Efficiencies to Clinical Trials in China

NEW YORK, N.Y. – July 31, 2013 – Daiichi Sankyo Co., Ltd., a top 20 pharmaceutical company headquartered in Japan, is expanding its use of Medidata Solutions’  (NASDAQ: MDSO) cloud-based platform to support clinical trials conducted by its division in China. A long-time Medidata customer, Daiichi Sankyo will bring Medidata’s industry-leading applications for electronic data capture (EDC) and clinical data management (CDM) and randomization and trial supply management (RTSM) to its expanding clinical work in China. Daiichi Sankyo’s investment in the Medidata Clinical Cloud™ for its China division is expected to streamline trial activities, improve the efficiency of data capture and increase the productivity of site users and clinical monitors.

By adopting Medidata’s industry-leading data capture, management and reporting solution (Medidata Rave®) in conjunction with its agile randomization and trial supply management solution (Medidata Balance®), Daiichi Sankyo will have access to a single cloud-based platform that provides a unified environment in which site staff will conduct patient randomization, supply dispensing and clinical data capture, as well as enabling collaboration and insightful metrics visibility for the whole research team.

Daiichi Sankyo chose the Medidata platform for its expansion in China based on the improvements realized from its large-scale use of the Medidata platform since 2005. Medidata Services Partner Tigermed, a leading contract research organization (CRO), was involved in the selection and implementation of the Medidata platform for the China-based work.

“Medidata and Daiichi Sankyo have a long and fruitful relationship as collaborators, and we are delighted they have asked us to provide the efficient, effective and modern infrastructure needed for their clinical development programs in China,” said Glen de Vries, president, Medidata Solutions. “Organizations like Daiichi Sankyo that leverage innovative clinical technologies will be the leaders in bringing new life-enhancing treatments to market.”

Original link to press release: http://www.mdsol.com/press/daiichi-sankyo-extends-use-of-the-medidata-clinical-cloud-to-china

Grünenthal Expands Use of BioClinica’s Next Generation Trident IWR/IVR for Global Trial

– Leading Pain Management Pharmaceutical Utilizes Next Generation IWR/IVR Solution –

BioClinica®, Inc., a global provider of clinical trial management solutions, announced today that Grünenthal GmbH will again utilize the latest enhancements in Trident IWR/IVR for a global clinical study on pain medication. The three-year study will involve 350 patients at 80 sites across 17 countries. Trident streamlines the clinical trials process, making it faster and easier to set up, test, and deploy clinical study protocols.

The integration of Trident into Grünenthal’s clinical trial process is already underway. The new interactive web and voice response technology replaces a manual system for managing inventory, returns, and accountability of controlled substances. Trident automates these tasks and makes it easier to stay in compliance with controlled substance regulations, an especially important consideration for a pain management specialist.

In evaluating IVRs, no other technology provider met the controlled substance functionality Grünenthal needed. BioClinica responded by developing a customized solution. “BioClinica listened to our needs and built the functionality we needed into their IRT system in time for our deadline,” said Henk Dieteren, Associate Director, Head Clinical Supply Manager, Compound Development and Branding, Clinical Development Operations, Clinical Trial Supply. “We envision IRT systems becoming an integral part of our controlled substance studies,” Dieteren added. “Trident’s ability to manage controlled substance inventory is crucial to meet strict regulatory requirements for conducting clinical trials that involve investigative compounds related to pain management.”

The selection of Trident expands Grünenthal’s use of BioClinica solutions, specifically OnPoint CTMS. In March 2012, the pharmaceutical company penned an agreement to use the customizable clinical trial management system enterprise-wide for gathering and sharing clinical trials information. Grünenthal made the selection recognizing OnPoint CTMS could help further innovation and development of new pain therapies.

“We are very pleased Grünenthal is expanding its use of BioClinica products to include Trident,” said Peter Benton, BioClinica’s President of eClinical Solutions. “The BioClinica suite of eClinical solutions is designed especially for this kind of integration, and will help Grünenthal manage its future clinical trials more efficiently.”

Trident reduces the average time it takes to write, set up, and validate IWR and IVR protocols from months to weeks. Trident supports all of a sponsor’s clinical studies within a standardized data model that better supports automatic drug pooling and reporting.  By giving clinical trial sponsors the ability to monitor and maintain all their study protocols in one place, it is more efficient and cost-effective than developing specifications, programming and validating a new IWR system for each new study.

BioClinica is offering a free webinar on the subject of controlled substance studies titled Controlled Substance Studies: Meeting the Supply Management Challenge on August 8th, 2013 at 11:00 AM Eastern Time. BioClinica representatives will also be available to demonstrate the company’s leading eClinical solutions at the Drug Information Association annual meeting June 24th through 26th in Boston, Massachusetts in Booth #1210.

Follow BioClinica on the Trial Blazers blog at http://info.bioclinica.com/blog, and on Twitter at http://twitter.com/bioclinica.

Clinical Systems Implementation – Five Things You Need to Do!

Recently I wrote a blog about the steps required to successfully select a clinical systems vendor that will truly deliver against your needs. To review: write a business case, collect detailed requirements, write a clear RFI, quickly narrow down your list of candidates, host some dog & pony shows, and finally make a decision. Whew! That’s a lot, and each step is more difficult and involved than it sounds. But that’s the case with just about everything in life, really, so I’m sure that being the smart and determined blog reader/system selector that you are, you managed to get through them all. Nice job! By now, however, you’ve probably realized that system selection is just the tip of the iceberg. Now that your company has dropped a good chunk of change because you wrote that nice business case explaining why buying the system was such a great idea, you have to make sure that people actually, you know, use it. Clinical systems implementation can be a tricky business. All the business cases in the world won’t help if the system doesn’t get rolled out properly, and even the most perfect of systems won’t be of any benefit if implemented in a vacuum.

Spare Some Change (Management)

Any time there are changes, there are going to be problems.  People don’t like change.  The lack of a change management strategy is one of the biggest reasons these shiny new systems fail to deliver on what they promise.  In short, management needs to communicate the changes well before they start happening so people can prepare themselves.  Really what needs to happen is essentially a marketing campaign – first teasing the upcoming changes (and the new system), then some ‘benefits management’ (letting people know what’s in it for them…you still have that business case, right?  Time for a little copy/paste…), setting expectations for future shifts in the way things work, a big launch that’s given some degree of fanfare, and a follow up campaign to reinforce the new order of things (and to remind people how great everything is working out).  All of this needs to be scaled to match with the impact that the new system will have on the organization – be careful not to overdo (or underdo) it!

People Power

While this is covered to a degree in the change management piece, it goes a lot deeper than that.  First, you need to find the system a home – who is going to ‘own’ it?  You also need to make sure that every layer of the organization that will be affected by the new system is addressed in a more ‘personal’ manner.  Have conversations with individuals to get a better understanding of their concerns and perhaps more importantly get an idea of their expectations of the new system.  Hopefully you’ve already addressed everything way back when you collected requirements, but things change and people come up with new ideas as they see for themselves more of what the system can do.  Regardless, you need to make sure that what the people need (a good set of reports, for example) will be available from the system at launch, or you’re setting yourself up to fail.

At a higher level, some organizational changes may be necessary to get everything out of your new system that you want.  For example, before the new system you may have had people doing manual data entry as half of their jobs.  But the new system uses fully automated data feeds – what are you going to do with all of that free time now?  Or the opposite might be true, where you need some kind of system administrator or business analyst permanently attached to the new system.  Make sure you plan ahead!

Process is Paramount

A new system will almost always require a new set of processes. After all, the reason you’re getting a new system is that it’s better (and therefore different) than your old system. But it’s not just the actual interaction with the new system that will need to be looked at. This is a great time to step back and evaluate your overall processes in the area the new system is a part of (and you’re already of a continuous improvement mindset anyway, right?). This is a great way to “market” your new system — because let’s face it, the old system wasn’t that bad. Usually the problems are at least 50% process oriented, if not more. By revamping your processes you can not only make them more efficient but also design them to dovetail nicely with the new system. When you are issuing your communications throughout the system selection/implementation project, this becomes a much more powerful message than merely touting a new system. “Hey, we’re overhauling everything so your lives are about to get way better!”

Testing Users’ Patience

There’s nothing worse than rolling out a system that people find to be buggy and unstable.  Even if the issues are minor, a bad first experience can leave a bad taste in people’s mouths for a long time (and perhaps forever).  Doing thorough system testing (especially if your system has been configured or customized) is critical.  User acceptance testing will also help to get a greater number of future users involved to not only spot technical issues but to identify general deficiencies in the system before it gets rolled out to the population at large.  Testing is painful but necessary if you are to release a quality “product” to your organization.

Systematic System Training

Even the most simple of systems will require some training, and there’s nothing better than some hands-on sessions where users get their hands dirty.  Just keep the extent of the training in line with the complexity of the system – don’t overcomplicate things.  Further, training should be focused for each job role as there’s no reason that a person in one job needs to know every little thing a person in a completely different job does in the system (other than at the highest of levels).  A training plan should be developed well before system roll out so everyone knows what to expect and blocks off time for training on their calendars.  But training doesn’t end with the actual training – users will also need access to detailed documentation for refreshers, and creating other aids such as “cheat sheets” and FAQs can go a long way in making sure people can easily put your new system to good use.

By now you’re probably regretting purchasing this new system.  Whose idea was this anyway?  Take a deep breath and fear not, for with some planning and oversight (and a good fully dedicated project manager) your system can experience a smooth roll out and significant user adoption.  Which means you did such a good job with this one that you’ll be the one put in charge of the next one…way to go!

Link to original post: http://www.pharmicaconsulting.com/clinical-systems-implementation-five-things-you-need-to-do/

Perceptive Informatics Improves Clinical Trial Management with New Release of Impact® CTMS Solution

Boston, MA, U.S., April 3, 2013 — Perceptive Informatics, a leading eClinical solutions provider and a subsidiary of PAREXEL International Corporation (NASDAQ: PRXL), today released an enhanced version of its comprehensive IMPACT® clinical trial management system (CTMS). Notable features include improved data entry as well as site management and monitoring capabilities. The system is designed for pharmaceutical and biotechnology companies of all sizes, and has more than 26,000 users in studies of up to tens of thousands of subjects and millions of visits.

According to market research firm MarketsandMarkets, the CTMS software industry is expected to jump from $567.2 million in 2010 to $1.3 billion by 2016.[1]  Perceptive Informatics’ CTMS solution provides a user-friendly infrastructure that enables clinicians to manage trials of varying complexity.

“Many top global pharmaceutical companies and CROs rely on the IMPACT CTMS solution to plan, administer and track every aspect of their clinical trials,” said Nick Richards, Vice President, Product Management, Perceptive Informatics. “As clients face further globalization of clinical trials and require site management and monitoring of increasingly complex studies, they need technology that keeps pace. Perceptive’s enhanced system represents a significant leap forward in flexibility and usability, enabling clients to improve the speed and effectiveness of their clinical development programs.”

Perceptive’s enhanced version is a customizable system which allows managers the flexibility to define which information is most important to their clinical trial and configure it accordingly. Highlights include:

  • Configurability by Clinical Trial Type – Customers can define their own clinical trial types – from single-site investigator-initiated studies to large multi-country studies – enabling them to select the critical screens and fields that each requires. This simplifies and streamlines clinical trial management and ensures that users see exactly what they need.
  • Assistance for Field Monitors – The IMPACT® MySites™ Module, which supports online and offline monitoring activities and the collection of associated management data, now enables contract research associates to record and manage contacts for the sites they monitor, allowing them to have a more comprehensive view of site activity.
  • Improved Data Mining Capabilities – The IMPACT® Investigator Module’s advanced search feature enables users to select physician names based on their location, specialty, therapeutic interests, responses to qualification questionnaires and investigator performance metrics in order to speed the finalization of candidate study sites and the recruitment of optimal investigators for a trial.

Perceptive Informatics’ IMPACT CTMS solution is used to plan, track and report on clinical trials and is available as a hosted, software-as-a-service (SaaS) application. It is an integral part of the Perceptive MyTrialsTM platform, which provides an application framework to converge its integrated suite of clinical trial software. For more information visit: http://www.perceptive.com/ctms/impact

[1] MarketsandMarkets, “Clinical Trial Management Systems (CTMS) Market: Global Trends, Opportunities, Challenge and Forecasts (2011 – 2016),” January 2012.

Link to original press release: http://www.perceptive.com/news/press-releases/2013/perceptive-informatics-improves-clinical-trial-management-with-n/