1. INTRODUCTION
1.1 What Are AI, ML, DL, Generative AI, and Large Language Models
1.2 Lifecycle of Large Language Models
1.3 Whom This Book Is For
1.4 How This Book Is Organized
1.5 Source Code and Resources
2. PYTORCH BASICS AND MATH FUNDAMENTALS
2.1 Tensor and Vector
2.2 Tensor and Matrix
2.3 Dot Product
2.4 Softmax
2.5 Cross Entropy
2.6 GPU Support
2.7 Linear Transformation
2.8 Embedding
2.9 Neural Network
2.10 Bigram and N-gram Models
2.11 Greedy Search, Random Sampling, and Beam Search
2.12 Rank of Matrices
2.13 Singular Value Decomposition (SVD)
2.14 Conclusion
3. TRANSFORMER
3.1 Dataset and Tokenization
3.2 Embedding
3.3 Positional Encoding
3.4 Layer Normalization
3.5 Feed-Forward Network
3.6 Scaled Dot-Product Attention
3.7 Mask
3.8 Multi-Head Attention
3.9 Encoder Layer and Encoder
3.10 Decoder Layer and Decoder
3.11 Transformer
3.12 Training
3.13 Inference
3.14 Conclusion
4. PRE-TRAINING
4.1 Machine Translation
4.2 Dataset and Tokenization
4.3 Loading Data in Batches
4.4 Pre-Training the nn.Transformer Model
4.5 Inference
4.6 Popular Large Language Models
4.7 Computational Resources
4.8 Prompt Engineering and In-Context Learning (ICL)
4.9 Prompt Engineering on FLAN-T5
4.10 Pipelines
4.11 Conclusion
5. FINE-TUNING
5.1 Fine-Tuning
5.2 Parameter-Efficient Fine-Tuning (PEFT)
5.3 Low-Rank Adaptation (LoRA)
5.4 Adapter
5.5 Prompt Tuning
5.6 Evaluation
5.7 Reinforcement Learning
5.8 Reinforcement Learning from Human Feedback (RLHF)
5.9 Implementation of RLHF
5.10 Conclusion
6. DEPLOYMENT OF LLMS
6.1 Challenges and Considerations
6.2 Pre-Deployment Optimization
6.3 Security and Privacy
6.4 Deployment Architectures
6.5 Scalability and Load Balancing
6.6 Compliance and Ethics Review
6.7 Model Versioning and Updates
6.8 LLM-Powered Applications
6.9 Vector Database
6.10 LangChain
6.11 Chatbot, Example of LLM-Powered Application
6.12 WebUI, Example of LLM-Powered Application
6.13 Future Trends and Challenges
6.14 Conclusion
INDEX
REFERENCES
ABOUT THE AUTHOR