meta-llama / llama-stack

Composable building blocks to build Llama Apps
MIT License

Cannot use LlamaGuardShield since not present in config #102

Closed by matbee-eth 1 month ago

matbee-eth commented 1 month ago

I'm trying to use Llama3.2-11B-Vision-Instruct with no PromptGuard / LlamaGuard, together with llama-stack-apps/app/main.py. Is there a way to disable Safety in the Agents logic?
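As a sketch of what "disabling Safety" might look like from the client side: the agents API accepts shield lists in the agent config, and leaving them empty should mean the safety router is never invoked. This is a hypothetical payload, not the actual llama-stack-apps code; the field names (`input_shields`, `output_shields`) are assumptions based on the AgentConfig schema and may differ by version.

```python
# Hypothetical /agents/create payload (field names assumed, not verified
# against this llama-stack version).
agent_config = {
    "model": "Llama3.2-11B-Vision-Instruct",
    "instructions": "You are a helpful assistant.",
    "input_shields": [],   # no PromptGuard / LlamaGuard on user input
    "output_shields": [],  # no shields on model output
    "enable_session_persistence": True,
}

# With both lists empty, run_multiple_shields has nothing to gather, so the
# shield lookup in the meta-reference safety provider is never reached.
assert not agent_config["input_shields"]
assert not agent_config["output_shields"]
```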

Getting this error in my terminal:

```
INFO:     172.17.0.1:39656 - "POST /agents/create HTTP/1.1" 200 OK
INFO:     172.17.0.1:39656 - "POST /agents/session/create HTTP/1.1" 200 OK
INFO:     172.17.0.1:39656 - "POST /agents/turn/create HTTP/1.1" 200 OK
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/llama_stack/distribution/server/server.py", line 231, in sse_generator
    async for item in event_gen:
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/agents/agents.py", line 127, in create_agent_turn
    async for event in agent.create_and_execute_turn(request):
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/agents/agent_instance.py", line 174, in create_and_execute_turn
    async for chunk in self.run(
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/agents/agent_instance.py", line 239, in run
    async for res in self.run_multiple_shields_wrapper(
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/agents/agent_instance.py", line 294, in run_multiple_shields_wrapper
    await self.run_multiple_shields(messages, shields)
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/agents/safety.py", line 37, in run_multiple_shields
    responses = await asyncio.gather(
  File "/usr/local/lib/python3.10/site-packages/llama_stack/distribution/routers/routers.py", line 168, in run_shield
    return await self.routing_table.get_provider_impl(shield_type).run_shield(
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/safety/safety.py", line 62, in run_shield
    shield = self.get_shield_impl(MetaReferenceShieldType(shield_type))
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/safety/safety.py", line 93, in get_shield_impl
    cfg is not None
AssertionError: Cannot use LlamaGuardShield since not present in config
```
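The failure mode can be boiled down to a few lines: the safety provider maps each shield type to an optional config section and asserts the section exists before building the shield. This is a paraphrase of the logic in `providers/impls/meta_reference/safety/safety.py`, not the actual source; names here are illustrative.

```python
# Minimal, self-contained reproduction of the assertion (illustrative
# paraphrase of get_shield_impl, not the real llama-stack code).
shield_configs = {
    "llama_guard": None,   # llama_guard_shield: null in the run config
    "prompt_guard": None,  # prompt_guard_shield: null
}

def get_shield_impl(shield_type: str):
    cfg = shield_configs.get(shield_type)
    # This mirrors the `assert cfg is not None` that raises in the traceback.
    assert cfg is not None, (
        f"Cannot use {shield_type} shield since not present in config"
    )
    return cfg

try:
    get_shield_impl("llama_guard")
except AssertionError as e:
    print(e)  # -> Cannot use llama_guard shield since not present in config
```

So any turn that routes to the `llama_guard` shield will hit this assertion as long as `llama_guard_shield` is `null` in the run config.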
My run config:

```yaml
built_at: '2024-09-25T18:42:07.306715'
image_name: local-gpu
docker_image: local-gpu
conda_env: null
apis_to_serve:
- agents
- memory
- models
- safety
- memory_banks
- shields
- inference
api_providers:
  inference:
    providers:
    - meta-reference
  safety:
    providers:
    - meta-reference
  agents:
    provider_id: meta-reference
    config:
      persistence_store:
        namespace: null
        type: sqlite
        db_path: /root/.llama/runtime/kvstore.db
  memory:
    providers:
    - meta-reference
  telemetry:
    provider_id: meta-reference
    config: {}
routing_table:
  inference:
  - provider_id: meta-reference
    config:
      model: Llama3.2-11B-Vision-Instruct
      quantization: null
      torch_seed: null
      max_seq_len: 4096
      max_batch_size: 1
    routing_key: Llama3.2-11B-Vision-Instruct
  safety:
  - provider_id: meta-reference
    config:
      llama_guard_shield: null
      prompt_guard_shield: null
    routing_key: llama_guard
  - provider_id: meta-reference
    config:
      llama_guard_shield: null
      prompt_guard_shield: null
    routing_key: code_scanner_guard
  - provider_id: meta-reference
    config:
      llama_guard_shield: null
      prompt_guard_shield: null
    routing_key: injection_shield
  - provider_id: meta-reference
    config:
      llama_guard_shield: null
      prompt_guard_shield: null
    routing_key: jailbreak_shield
  memory:
  - provider_id: meta-reference
    config: {}
    routing_key: vector
```
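For anyone who wants to keep the shields rather than disable them, the server-side fix would be to populate `llama_guard_shield` instead of leaving it `null`. A hedged sketch of what that section might look like; the field names below (`model`, `excluded_categories`, `disable_input_check`, `disable_output_check`) are assumptions about the meta-reference `LlamaGuardShieldConfig` and may vary by llama-stack version:

```yaml
# Illustrative only; verify field names against your llama-stack version.
safety:
- provider_id: meta-reference
  config:
    llama_guard_shield:
      model: Llama-Guard-3-8B     # assumed shield model name
      excluded_categories: []
      disable_input_check: false
      disable_output_check: false
    prompt_guard_shield: null
  routing_key: llama_guard
```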
matbee-eth commented 1 month ago

My mistake: the shields are hardcoded in the agent example in the apps' client_utils.