We're a collective of students from Hopkins School of Public Health writing about Public Health and AI.

BERT and the Power of Transfer Learning in NLP

Discover how BERT (Bidirectional Encoder Representations from Transformers) revolutionized NLP by learning deep contextual relationships, and how transfer learning allows us to leverage its power for custom tasks.

The Transformer Architecture: The Model That Changed NLP Forever

An exploration of the Transformer architecture and its core component, the self-attention mechanism, which together form the foundation of modern large language models like GPT and BERT.

Long Short-Term Memory (LSTM): Overcoming RNNs' Limitations

Dive into Long Short-Term Memory (LSTM) networks, a special kind of RNN that can learn long-term dependencies, a capability that transformed natural language processing and time-series analysis.

Recurrent Neural Networks (RNNs): Understanding Sequential Data

An introduction to Recurrent Neural Networks (RNNs), the models that give machines a sense of memory, making them ideal for tasks like translation and speech recognition.

Convolutional Neural Networks (CNNs): The Eyes of Deep Learning

A deep dive into Convolutional Neural Networks (CNNs), the powerhouse behind modern computer vision. Learn how they 'see' and classify images with incredible accuracy.

Demystifying Backpropagation: The Core of Neural Network Training

A beginner-friendly guide to understanding backpropagation, the fundamental algorithm that powers deep learning. We'll break down the concepts and provide a practical code example.