Quick start
In this guide, we demonstrate how you can use the PhariaEngine SDK to build a simple Skill that performs a chat request. The full code can be found here.
Note that the PhariaEngine SDK is published on Aleph Alpha’s PyPI account.
Prerequisites
- Python 3.11 or later (download Python)
- uv - Python package installer (install uv)
1. Create a new project
In this guide, we create an example Skill that we call haiku, but you can use any name you want for your project.
Create a new project:
uv init haiku && cd haiku
Add the PhariaEngine SDK as a dependency:
uv add pharia-skill
2. Write a Skill
Now, we create the Skill. Create a file called haiku.py, and add the following code:
# haiku.py
from pharia_skill import Csi, Message, skill
from pydantic import BaseModel


class Input(BaseModel):
    topic: str


class Output(BaseModel):
    haiku: str


@skill
def generate_haiku(csi: Csi, input: Input) -> Output:
    system = Message.system("You are a poet who strictly speaks in haikus.")
    user = Message.user(input.topic)
    response = csi.chat("llama-3.1-8b-instruct", [system, user])
    return Output(haiku=response.message.content.strip())
Here we define the input and output types as Pydantic models. Then we create our entry point by decorating a function with @skill. This function must adhere to the type signature shown in the example. The first argument, Csi, provides the interface for interacting with the outside world, such as the chat request in our example.
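Because the input and output are Pydantic models, they also define the JSON interface of the Skill. The following sketch (assuming Pydantic v2) shows how a request body like the one used in step 7 maps onto these models:
# Illustrative only; this mirrors what happens around your Skill function.
from haiku import Input, Output

# The request body {"topic": "Oat milk"} is validated into an Input instance ...
payload = Input.model_validate_json('{"topic": "Oat milk"}')
print(payload.topic)  # "Oat milk"

# ... and the returned Output is serialized back to JSON for the response.
print(Output(haiku="Creamy oat milk flows").model_dump_json())  # {"haiku":"Creamy oat milk flows"}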
3. Test
The @skill decorator does not modify the decorated function, which allows the test code to inject different variants of CSI.
The pharia_skill.testing module provides two implementations of CSI for testing; in this guide, we use the DevCsi, which forwards CSI requests to a running PhariaKernel instance.
See the Concepts section for more information on the differences between running Skills in PhariaEngine and locally.
To test against the DevCsi, we need two environment variables:
# .env
PHARIA_AI_TOKEN=
PHARIA_KERNEL_ADDRESS=
Now, create a test_haiku.py file and add the following code:
# test_haiku.py
from haiku import generate_haiku, Input
from pharia_skill.testing import DevCsi


def test_haiku():
    csi = DevCsi()
    result = generate_haiku(csi, Input(topic="Oat milk"))
    assert "creamy" in result.haiku or "white" in result.haiku
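Depending on how you invoke the tests, the variables from .env may not be exported to the process environment automatically. One option, assuming you add python-dotenv as a dev dependency (uv add python-dotenv --dev), is to load the file at the top of the test:
# At the top of test_haiku.py, before constructing DevCsi
from dotenv import load_dotenv

load_dotenv()  # exports PHARIA_AI_TOKEN and PHARIA_KERNEL_ADDRESS from .env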
Install pytest:
uv add pytest --dev
Now run the test:
uv run pytest test_haiku.py
4. Build
You now build your Skill, which produces a haiku.wasm file:
uv run pharia-skill build haiku --no-interactive
Note that without the --no-interactive flag, you will be prompted to optionally publish the Skill.
5. Publish
We are ready to publish the Skill to a registry. Make sure you understand which namespaces are configured in your PhariaEngine instance and to which registries they are linked.
For the p-prod instance, we have set up a playground you can deploy to.
First, set the required environment variables:
# .env
SKILL_REGISTRY_USER=
SKILL_REGISTRY_TOKEN=
SKILL_REGISTRY=registry.gitlab.aleph-alpha.de
SKILL_REPOSITORY=engineering/pharia-kernel-playground/skills
To publish your Skill under a name of your choice, run:
uv run pharia-skill publish haiku.wasm --name custom_name
6. Deploy
To know which Skills to serve, PhariaEngine watches a list of configured namespaces. These are typically TOML files hosted on a server.
If you are deploying to the playground, simply update the namespace.toml file in the GitLab UI and add your Skill. The name must match the name under which you published the Skill:
# namespace.toml
skills = [
    { name = "greet" },
    { name = "haiku", tag = "latest" }
]
7. Invoke your Skill using the API
When your Skill is deployed, you can test it by making an API call to PhariaEngine. See the OpenAPI documentation at https://pharia-kernel.yourpharia.domain/api-docs to construct your request.
The following is an example using curl:
curl 'https://pharia-kernel.yourpharia.domain/v1/skills/{namespace}/{name}/run' \
--request POST \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $PHARIA_AI_TOKEN" \
--data '{"topic": "Some text to be a haiku"}'
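The same request can also be made from Python. The following is a minimal sketch assuming the requests package is installed; the namespace ("playground" here) and the Skill name are placeholders that depend on how you published and deployed your Skill:
# invoke_haiku.py (illustrative)
import os

import requests

# Replace "playground" and "haiku" with your configured namespace and Skill name.
url = "https://pharia-kernel.yourpharia.domain/v1/skills/playground/haiku/run"
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {os.environ['PHARIA_AI_TOKEN']}"},
    json={"topic": "Oat milk"},
)
response.raise_for_status()
print(response.json())  # e.g. {"haiku": "..."}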