Propel a Digital Culture Shift by Devising Strategies to Better Obtain Data & Augment Its Quality
Time: 10:30 am
Day: Pre-Conference Day
Details:
AI algorithms, particularly deep learning models, require large, high-quality datasets to make accurate predictions and generate meaningful insights. In drug discovery and preclinical development, obtaining comprehensive, well-annotated datasets is challenging due to the complexity and diversity of biological systems. Moreover, experimental data can be expensive and time-consuming to generate, leading to limited data availability, especially for rare diseases or specific biological targets.
This workshop will gather experts to discuss:
- Leveraging external data to accelerate sample gathering and attain a better understanding of disease pathogenesis
- Storing huge volumes of data while minimising costs
- Expanding open-source data sharing to improve data quality, helping AI keep pace with the complexity of biological systems
- Building comparable data sets through highly automated, robotic procedures that ensure all data sets are generated in the same way
- Standardising the approach towards data acquisition and sourcing to enhance data integrity