🧪 Back in 2020, during my Master's thesis, I first discovered LLMs, fine-tuning BERT for information retrieval in the COVID domain. It's amazing how these transformer-based architectures served as catalysts for the dramatic evolution of the LLMs making headlines today.
💪 Time for a new challenge to fuel my curiosity and growth! I'm committing to at least 1 hour a day for the next 100 days: researching, playing with, and solving problems with LLMs to ignite my imagination and upskill.
✨ The future belongs to continuous learners! To hold myself accountable, I'll share my progress daily using #100DaysOfLargeLanguageModels on LinkedIn.
🔥 Looking forward to all the discoveries ahead, and hopefully to inspiring others along the way.