The Middleware Service provides content generation and embedding vector APIs. It connects to OpenAI services for processing user queries and exposes these APIs via Azure API Management (APIM) in production.
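As a rough illustration of how a client might call the content-generation API once the service is running, the sketch below builds an HTTP request with the JDK's built-in client. The `/api/v1/generate` path and the request body shape are assumptions for illustration only; check the middleware's controller code for the actual routes.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class MiddlewareClientSketch {
    public static void main(String[] args) {
        // Hypothetical endpoint path; the real route is defined by the
        // middleware's controllers, not by this sketch.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/api/v1/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"query\":\"summer outfits\"}"))
                .build();

        // Sending the request requires the middleware to be running locally:
        // HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(request.method() + " " + request.uri());
    }
}
```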
Prerequisites:
- Java 17 or higher
- Maven
- Access to Azure OpenAI and Azure Key Vault
- Backend service running locally on port 8080
- Clone the repository:

  ```shell
  git clone https://github.com/Azure-Samples/Java-AI-Based-Content-Generator
  cd Java-AI-Based-Content-Generator/middleware
  ```
- Configure the application properties: to avoid port conflicts, ensure `server.port` is set to `8081` in the `application.properties` file:

  ```properties
  server.port=8081
  ```
- Key Vault setup: before adding or accessing secrets in Key Vault, follow the setup instructions in the Azure Key Vault Setup Guide, which include:
  - Assigning the necessary Key Vault Administrator and Key Vault Reader roles.
  - Adding the required secrets for the middleware service.
- Environment variables for local development: set the required Azure environment variables so the service can access secrets from Key Vault:

  ```shell
  export AZURE_KEYVAULT_URI=<your_keyvault_url>
  ```
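If the `AZURE_KEYVAULT_URI` variable is missing, Key Vault lookups will fail at runtime. A small fail-fast check like the one below can surface the problem at startup instead; the `requireEnv` helper is illustrative and not part of the actual service code.

```java
import java.util.Map;

public class EnvCheck {
    // Returns the value of a required environment variable, or fails fast
    // with a clear message if it is missing or blank.
    static String requireEnv(Map<String, String> env, String name) {
        String value = env.get(name);
        if (value == null || value.isBlank()) {
            throw new IllegalStateException(
                    name + " is not set; export it before starting the service");
        }
        return value;
    }

    public static void main(String[] args) {
        String vaultUri = requireEnv(System.getenv(), "AZURE_KEYVAULT_URI");
        System.out.println("Using Key Vault: " + vaultUri);
    }
}
```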
- Run the application:
  - Install dependencies:

    ```shell
    ./mvnw clean install
    ```

  - Start the service:

    ```shell
    ./mvnw spring-boot:run
    ```

  The service will now be running at `http://localhost:8081`.
The middleware service requires the following secrets in Azure Key Vault:
- `BackendServiceBaseUrl`: `http://localhost:8080` (for local), or the backend APIM URL (for deployment)
- `BackendServiceProductEndpoint`: `/api/v1/product` (for local), or the backend APIM endpoint
- `BackendServiceSimilarProductEndpoint`: `/api/v1/product/similar` (for local), or the backend APIM endpoint
- `AzureOpenAiEndpointUrl`: OpenAI GPT-4o completion model Target URI
- `AzureOpenAiAccessKey`: OpenAI GPT-4o key
- `AzureOpenAiEmbeddingEndpointUrl`: OpenAI text-embedding-3-small model Target URI
- `AzureOpenAiEmbeddingKey`: OpenAI embedding model key
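To see how these values fit together: the middleware can combine `BackendServiceBaseUrl` with one of the endpoint secrets to form a full backend URL. A minimal sketch, where the `join` helper is illustrative rather than the actual service code:

```java
public class BackendUrls {
    // Joins a base URL and an endpoint path, tolerating a trailing slash
    // on the base so "http://host/" + "/path" does not double the slash.
    static String join(String baseUrl, String endpoint) {
        String base = baseUrl.endsWith("/")
                ? baseUrl.substring(0, baseUrl.length() - 1)
                : baseUrl;
        return base + endpoint;
    }

    public static void main(String[] args) {
        // Values mirror the local-development secrets listed above.
        String base = "http://localhost:8080";
        System.out.println(join(base, "/api/v1/product"));
        System.out.println(join(base + "/", "/api/v1/product/similar"));
    }
}
```

In deployment, the same secrets simply hold the APIM URL and endpoints instead, so the composition logic does not change.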
- For Azure App Service deployment, see app_service.md.
- For AKS deployment, see aks.md.