High performance computing for high content screening – A case study
With today's data analysis systems, researchers at pharmaceutical companies estimate that a phenotypic screening campaign processing approximately 500,000 compounds requires at least three months of image and data analysis.
Furthermore, multiple disparate software systems are used at different stages of the workflow, including image analysis, cell-level data analysis, well-level data analysis, hit stratification, multivariate/machine-learning analysis and visualisation, reporting, collaboration and persistence.
In this webinar, PerkinElmer and AMRI will present a case study in which High-Performance Computing (HPC) was used to accelerate image and data analysis for High Content Screening experiments.
Learn how to:
- Complete batch re-analysis jobs in days
- Complete clustering and other machine learning methods in minutes
- Balance flexibility, automation, and scalability for large and small organisations
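The clustering step above is the kind of machine-learning analysis that benefits most from scale: grouping wells by their phenotypic feature vectors to reveal compound classes. The webinar's actual pipeline and data are not public, so the following is only a minimal illustrative sketch of k-means clustering over hypothetical per-well features; a production screen would use a library such as scikit-learn and distribute the work across HPC nodes.

```python
# Minimal k-means sketch over per-well phenotypic features.
# All data and names here are illustrative, not from the case study.
import math


def kmeans(points, k, iters=20):
    # Farthest-point initialisation: deterministic and well spread out.
    centroids = [points[0]]
    while len(centroids) < k:
        far = max(points, key=lambda p: min(math.dist(p, c) for c in centroids))
        centroids.append(far)
    for _ in range(iters):
        # Assign each well to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[j].append(p)
        # Recompute each centroid as the mean of its cluster.
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = tuple(sum(x) / len(c) for x in zip(*c))
    # Final cluster label for every well.
    return [min(range(k), key=lambda i: math.dist(p, centroids[i])) for p in points]


# Toy per-well feature vectors with two obvious phenotype groups.
wells = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
labels = kmeans(wells, k=2)
print(labels)  # → [0, 0, 0, 1, 1, 1]
```

On a real 500,000-compound screen the feature matrix is far too large for a single loop like this; the HPC approach described in the webinar parallelises such steps so they finish in minutes rather than days.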