Your First Llama Stack Application
Understand how pre-configured Llama Stack distributions work, how to interact with the system via Python code, and how to extend this setup for more complex tasks down the line.
We'll cover the following...
- The purpose of a quick-start distribution
- Step 1: Set up your Together AI account
- Step 2: Set up environment variables
- Step 3: Launch the Llama Stack server
- Step 4: Create your first application script
- Bonus step: Using the Together AI API
- Using LlamaStackAsLibraryClient
- Understanding the unified API interface
- Switching providers
- Complete code
- Final thoughts
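As a rough preview of Steps 2 and 3 above, the setup can be sketched as a small shell fragment. The environment variable name, the distribution name, and the exact CLI invocation here are assumptions based on common Llama Stack conventions, not commands confirmed by this lesson — the step-by-step sections below give the authoritative versions:

```sh
# Assumed variable name for your Together AI API key (replace the placeholder)
export TOGETHER_API_KEY="your-api-key-here"

# Launch a Llama Stack server from a pre-built distribution
# (distribution name and port are illustrative; check your installation)
llama stack run together
```

The point of this flow is that the provider credential lives in the environment, not in your application code, so switching providers later means changing configuration rather than rewriting the script.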
After setting up our development environment locally, we’re now ready to run our first real application. We’ll build a simple script that connects to a remote model hosted by