Artificial intelligence-based technologies for biophotonic data

For decades, biophotonic technologies have been advancing rapidly across the sciences. These technologies reveal not only structural but also molecular and functional changes in the sample under investigation. They also offer prominent advantages such as high molecular sensitivity, ease of use, compactness, and high spatial and temporal resolution. Because of these advantages, biophotonic technologies hold great potential for clinical applications. Researchers now emphasize the use of biophotonic technologies for point-of-care testing in clinics and for in vivo imaging of live cells to automate the disease-diagnosis workflow. Furthermore, researchers are integrating multiple biophotonic technologies into a single unit to understand diseases at the cellular, molecular, and tissue levels. These ever-increasing developments produce massive amounts of biophotonic data, and analyzing such large datasets manually is challenging. Therefore, algorithms that can automatically analyze biophotonic data and extract useful "patterns", as an experienced analyst would, are crucial. Extracting patterns from data with algorithms that imitate human intelligence by learning from the data itself falls within the field of "artificial intelligence" (AI). Applying AI to data from biophotonic technologies such as Raman spectroscopy, coherent anti-Stokes Raman scattering (CARS) microscopy, two-photon excitation fluorescence (TPEF) microscopy, and second-harmonic generation (SHG) microscopy is the main focus of this thesis. Concisely, this thesis uses AI and biophotonic data for biomedical applications such as disease prediction, segmentation of tissue regions, and transformation of one imaging modality into another. The results presented here show that combining AI with biophotonic technologies can benefit biomedicine and the life sciences.


Use and reproduction:
All rights reserved