# Variables

Configure dynamic variables used in translation requests, including context memory for AI/LLM translations.

## Context memory

{% hint style="info" %}
**Note:** This section describes manual variable configuration for **Custom MT**.\
**In VNTranslator versions >= v0.8.8**, context memory is automatically applied to all AI/LLM API services, so manual configuration is not required.
{% endhint %}

Maintains context history for AI/LLM translations, allowing the model to reference previous dialogue for improved coherence and accuracy. Customize the system prompt and control how much context is retained.

<figure><img src="https://4121582948-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FKz66WcKqTPRwdFrHi4mM%2Fuploads%2Fiu8SEXJwGhf7WcKaHhUn%2Fconversation-memory-settings.jpg?alt=media&#x26;token=a443326e-91af-4c74-b904-38a0573e64b9" alt=""><figcaption></figcaption></figure>

### Configuration

* **Import from JSON file** \
  Specifies the path to a JSON file containing context history to import.
* **Context Source**\
  Choose where context is retrieved from:
  * **Translation Memory** \
    Uses recent translations from Translation Memory as context
  * **New Translation**\
    Starts a fresh conversation for each translation request
  * **JSON File** \
    Imports context history from the specified JSON file
* **Max Context Entries** \
  Sets the maximum number of context entries to store in memory. Older entries are removed when this limit is reached.
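When the context source is **JSON File**, the imported file holds prior exchanges for the model to reference. The exact schema is not documented here; the sketch below is an illustrative assumption based on the chat-style templates used later in this page, with placeholder content:

```json
[
  {"role": "system", "content": "You are a translation assistant."},
  {"role": "user", "content": "こんにちは"},
  {"role": "assistant", "content": "Hello"}
]
```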

### API Variables

Use one of the following variables in your API requests to include context memory:

* `$MT::ConversationMemory.Entries::ToArray()`
* `$MT::ContextMemory.Entries::ToArray()`

These variables expand to an array of context entries that can be passed to AI/LLM APIs.

### Example: OpenAI GPT Integration

**Initial prompt:**

```
You will be provided with a sentence in Japanese,
and your task is to translate it into English accurately.
If there are any cultural references or nuances within the text,
kindly provide a brief explanation or context for those as well.
The text is as follows:
```

**System template:**

```json
[{"role": "system", "content": "$PROMPT"}]
```

**User template:**

```json
[{"role": "user", "content": "$ORIGINAL_TEXT"}]
```

**Assistant template:**

```json
[{"role": "assistant", "content": "$TRANSLATED_TEXT"}]
```
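Taken together, the templates above assemble into a single chat-style messages array. Assuming one prior exchange is stored in context memory, the expanded value of `$MT::ConversationMemory.Entries::ToArray()` would look roughly like the following (the Japanese/English strings are placeholders for illustration):

```json
[
  {"role": "system", "content": "You will be provided with a sentence in Japanese, ..."},
  {"role": "user", "content": "お元気ですか？"},
  {"role": "assistant", "content": "How are you?"},
  {"role": "user", "content": "元気です。"}
]
```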

**Custom MT Configuration**

Complete configuration example for OpenAI GPT with context memory:

```json
{
  "version": "2",
  "service": "openai",
  "lang": {
    "source": [
      { "name": "Custom", "value": "Custom" }
    ],
    "target": [
      { "name": "Custom", "value": "Custom" }
    ]
  },
  "config": {
    "method": "post",
    "encodeURI": false,
    "encodeURIComponent": false,
    "postURL": "https://api.openai.com/v1/chat/completions",
    "postData": {
      "model": "gpt-3.5-turbo",
      "messages": "$MT::ConversationMemory.Entries::ToArray()",
      "temperature": 0.9,
      "max_tokens": 4096,
      "top_p": 1,
      "frequency_penalty": 0.7,
      "presence_penalty": 0.7
    },
    "postOptions": {
      "headers": {
        "Authorization": "Bearer [YOUR_API_KEY_HERE]",
        "Content-Type": "application/json"
      }
    },
    "responseParse": true,
    "responseType": "json",
    "responseQuery": "choices[0].message.content"
  }
}
```
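For reference, the `responseQuery` of `choices[0].message.content` extracts the translated text from a standard OpenAI chat completion response, which (abridged to the relevant fields) looks like:

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "How are you?"},
      "finish_reason": "stop"
    }
  ]
}
```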
