diff --git a/0.3/tutorials/giphy/index.html b/0.3/tutorials/giphy/index.html
index 063ace6e..0051496d 100644
--- a/0.3/tutorials/giphy/index.html
+++ b/0.3/tutorials/giphy/index.html
@@ -59,7 +59,7 @@
└── pyproject.toml
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.
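    If you defer exporting the key, a small sanity check at application startup turns a confusing LLM failure later into an immediate, readable error. A minimal sketch, assuming nothing beyond the standard library (the helper name is hypothetical, not part of FastAgency):

    ```python
    import os

    def require_openai_key() -> str:
        """Fail fast with a clear message if OPENAI_API_KEY was never exported."""
        key = os.environ.get("OPENAI_API_KEY")
        if not key:
            raise RuntimeError(
                "OPENAI_API_KEY is not set; export it in the devcontainer's "
                "terminal before starting the app."
            )
        return key
    ```

    Calling this once before constructing the agents surfaces a missing key immediately instead of mid-conversation.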

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app
    diff --git a/0.3/tutorials/mesop_template/index.html b/0.3/tutorials/mesop_template/index.html
    index 7ca50be4..81b25f70 100644
    --- a/0.3/tutorials/mesop_template/index.html
    +++ b/0.3/tutorials/mesop_template/index.html
    @@ -59,7 +59,7 @@
     └── pyproject.toml
     
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app
    @@ -97,6 +97,6 @@
     [2024-10-10 13:19:18 +0530] [23635] [INFO] Listening at: http://127.0.0.1:8000 (23635)
     [2024-10-10 13:19:18 +0530] [23635] [INFO] Using worker: sync
     [2024-10-10 13:19:18 +0530] [23645] [INFO] Booting worker with pid: 23645

    The command will launch a web interface where users can input their requests and interact with the agents (in this case at http://localhost:8000).

    Note

    Ensure that your OpenAI API key is set in the environment, as the agents rely on it to interact using GPT-4o. If the API key is not correctly configured, the application may fail to retrieve LLM-powered responses.

    \ No newline at end of file
diff --git a/0.3/tutorials/whatsapp/index.html b/0.3/tutorials/whatsapp/index.html
index 54f448a4..ffc6b800 100644
--- a/0.3/tutorials/whatsapp/index.html
+++ b/0.3/tutorials/whatsapp/index.html
@@ -59,7 +59,7 @@
└── pyproject.toml
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app
    diff --git a/latest/tutorials/giphy/index.html b/latest/tutorials/giphy/index.html
    index 063ace6e..0051496d 100644
    --- a/latest/tutorials/giphy/index.html
    +++ b/latest/tutorials/giphy/index.html
    @@ -59,7 +59,7 @@
     └── pyproject.toml
     
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app
    diff --git a/latest/tutorials/mesop_template/index.html b/latest/tutorials/mesop_template/index.html
    index 7ca50be4..81b25f70 100644
    --- a/latest/tutorials/mesop_template/index.html
    +++ b/latest/tutorials/mesop_template/index.html
    @@ -59,7 +59,7 @@
     └── pyproject.toml
     
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app
    @@ -97,6 +97,6 @@
     [2024-10-10 13:19:18 +0530] [23635] [INFO] Listening at: http://127.0.0.1:8000 (23635)
     [2024-10-10 13:19:18 +0530] [23635] [INFO] Using worker: sync
     [2024-10-10 13:19:18 +0530] [23645] [INFO] Booting worker with pid: 23645

    The command will launch a web interface where users can input their requests and interact with the agents (in this case at http://localhost:8000).

    Note

    Ensure that your OpenAI API key is set in the environment, as the agents rely on it to interact using GPT-4o. If the API key is not correctly configured, the application may fail to retrieve LLM-powered responses.

    \ No newline at end of file
diff --git a/latest/tutorials/whatsapp/index.html b/latest/tutorials/whatsapp/index.html
index 54f448a4..ffc6b800 100644
--- a/latest/tutorials/whatsapp/index.html
+++ b/latest/tutorials/whatsapp/index.html
@@ -59,7 +59,7 @@
└── pyproject.toml
  • To run LLM-based applications, you need an API key for the LLM you use. The most commonly used provider is OpenAI. To use it, create an OpenAI API key and set it as an environment variable in the terminal using the following command:

    export OPENAI_API_KEY=openai_api_key_here
     

    If you want to use a different LLM provider, follow this guide.

    Alternatively, you can skip this step and set the LLM API key as an environment variable later in the devcontainer's terminal. If you open the project in Visual Studio Code using the GUI, you will need to set the environment variable manually in the devcontainer's terminal.

  • Open the generated project in Visual Studio Code with the following command:

    code my_fastagency_app

  • Once the project is opened, you will get the following option to reopen it in a devcontainer:

  • After reopening the project in the devcontainer, you can verify that the setup is correct by running the provided tests with the following command:

    pytest -s
     

    You should get the following output if everything is set up correctly.

    =================================== test session starts ===================================
     platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0
     rootdir: /workspaces/my_fastagency_app