manbeast3b committed on
Commit
0b9851a
·
verified ·
1 Parent(s): aaf8cb2

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ assistant_female_voice.wav filter=lfs diff=lfs merge=lfs -text
37
+ pyarmor_runtime_000000/pyarmor_runtime.so filter=lfs diff=lfs merge=lfs -text
38
+ spk_001.wav filter=lfs diff=lfs merge=lfs -text
CONFIDENCE_IMPLEMENTATION.md ADDED
@@ -0,0 +1,172 @@
1
+ # Confidence-Enabled Chat Implementation
2
+
3
+ ## Overview
4
+
5
+ This implementation adds confidence scoring to the chat function using logits-based confidence calculation. The confidence score is derived from the model's token probabilities during generation, providing a more reliable measure of the model's certainty than self-assessment.
6
+
7
+ ## Changes Made
8
+
9
+ ### 1. New Function: `chat_with_confidence()` (Logits-based)
10
+
11
+ **Location**: `server.py` lines 393-522
12
+
13
+ **Features**:
14
+ - Returns structured response: `{"response": str, "confidence": int}`
15
+ - Confidence score range: 0-10 (integer)
16
+ - Based on average log probability of generated tokens
17
+ - Uses entropy calculation for more nuanced scoring
18
+ - Handles all existing rule detection and prompt modifications
19
+ - Maintains backward compatibility
20
+
21
+ ### 2. New Function: `chat_with_json_confidence()` (JSON Schema-based)
22
+
23
+ **Location**: `server.py` lines 564-704
24
+
25
+ **Features**:
26
+ - Returns structured response: `{"response": str, "confidence": int}`
27
+ - Confidence score range: 0-10 (integer)
28
+ - Based on model's self-assessment via JSON prompting
29
+ - Forces model to return JSON with response and confidence
30
+ - More reliable confidence variation
31
+ - Robust JSON parsing with fallback handling
32
+
33
+ **Key Implementation Details** (logits-based scoring used by `chat_with_confidence()`):
34
+ ```python
35
+ # Generate with logits enabled
36
+ outputs = lm.generate(
37
+ # ... existing parameters ...
38
+ return_dict_in_generate=True, # Enable structured output
39
+ output_scores=True, # Return logits for each token
40
+ )
41
+
42
+ # Calculate confidence from token probabilities
43
+ for i, score in enumerate(scores):
44
+ probs = torch.softmax(score, dim=-1)
45
+ chosen_token_id = output_ids[i]
46
+ token_prob = probs[0, chosen_token_id].item()
47
+ total_log_prob += torch.log(torch.tensor(token_prob)).item()
48
+
49
+ # Scale to 0-10 confidence range
50
+ confidence = max(0, min(10, int((avg_log_prob + 5) * 2)))
51
+ ```
52
+
53
+ ### 3. Updated Function: `chat()`
54
+
55
+ **Location**: `server.py` lines 707-715
56
+
57
+ **Changes**:
58
+ - Now uses `chat_with_confidence()` internally
59
+ - Returns only the response string for backward compatibility
60
+ - No breaking changes to existing code
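+ 
+ A minimal sketch of this wrapper, assuming `chat_with_confidence()` behaves as described in section 1 (not copied verbatim from `server.py`):
+ 
+ ```python
+ def chat(system_prompt: str, user_prompt: str) -> str:
+     # Delegate to the confidence-aware path, then drop the score
+     # so existing callers keep receiving a plain string.
+     result = chat_with_confidence(system_prompt, user_prompt)
+     return result["response"]
+ ```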
61
+
62
+ ### 4. Updated API Endpoint: `/api/v1/v2t`
63
+
64
+ **Location**: `server.py` lines 825-898
65
+
66
+ **Changes**:
67
+ - **Maintains original response format**: `{"text": str}` (no breaking changes)
68
+ - Uses `chat_with_json_confidence()` internally for confidence calculation
69
+ - **Prints response and confidence to console** before returning
70
+ - All error cases also print confidence information
71
+ - Enhanced logging includes confidence information
72
+
73
+ **Response Format** (unchanged):
74
+ ```json
75
+ {
76
+ "text": "The generated response text..."
77
+ }
78
+ ```
79
+
80
+ **Console Output** (new):
81
+ ```
82
+ Response: The generated response text...
83
+ Confidence: 7
84
+ ```
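+ 
+ A hedged sketch of the endpoint behaviour described above; the handler and variable names are illustrative, not the exact ones in `server.py`:
+ 
+ ```python
+ def v2t_handler(system_prompt: str, transcribed_text: str) -> dict:
+     # chat_with_json_confidence as described earlier in this document
+     result = chat_with_json_confidence(system_prompt, transcribed_text)
+     # Confidence is surfaced on the server console only...
+     print(f"Response: {result['response']}")
+     print(f"Confidence: {result['confidence']}")
+     # ...while the wire format stays {"text": str}.
+     return {"text": result["response"]}
+ ```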
85
+
86
+ ## Confidence Score Interpretation
87
+
88
+ | Score | Level | Description |
89
+ |-------|-------|-------------|
90
+ | 8-10 | High | Model is very confident in the response |
91
+ | 6-7 | Medium| Model is moderately confident |
92
+ | 4-5 | Low | Model is uncertain about the response |
93
+ | 0-3 | Very Low | Model is very uncertain or error occurred |
94
+
95
+ ## Technical Details
96
+
97
+ ### Confidence Calculation Method
98
+
99
+ 1. **Token Probability Extraction**: For each generated token, extract the probability from the model's logits
100
+ 2. **Log Probability Sum**: Calculate the sum of log probabilities for all generated tokens
101
+ 3. **Average Calculation**: Divide by the number of valid tokens
102
+ 4. **Scaling**: Map from typical log probability range (-5 to 0) to confidence range (0-10)
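+ 
+ A minimal sketch of step 4, assuming `avg_log_prob` has already been computed as in steps 1-3 (the helper name is illustrative):
+ 
+ ```python
+ def scale_confidence(avg_log_prob: float) -> int:
+     # Shift the typical [-5, 0] range up to [0, 10] and clamp to valid bounds.
+     return max(0, min(10, int((avg_log_prob + 5) * 2)))
+ 
+ scale_confidence(0.0)   # -> 10, tokens were near-certain
+ scale_confidence(-1.5)  # -> 7, moderately confident
+ scale_confidence(-5.0)  # -> 0, very uncertain
+ ```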
103
+
104
+ ### Error Handling
105
+
106
+ - **Model Loading Errors**: Returns confidence 0
107
+ - **Generation Errors**: Returns confidence 0
108
+ - **Empty Input**: Returns confidence 0
109
+ - **Authentication Failures**: Returns confidence 0
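+ 
+ A sketch of how callers can rely on this contract; `safe_chat` is a hypothetical wrapper, not a function in `server.py`:
+ 
+ ```python
+ from server import chat_with_confidence
+ 
+ def safe_chat(system_prompt: str, user_prompt: str) -> dict:
+     # Any failure collapses to the documented fallback: confidence 0.
+     try:
+         return chat_with_confidence(system_prompt, user_prompt)
+     except Exception as err:
+         return {"response": f"Generation failed: {err}", "confidence": 0}
+ ```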
110
+
111
+ ## Files Created
112
+
113
+ 1. **`test_confidence.py`**: Test script to verify functionality
114
+ 2. **`example_usage.py`**: Usage examples and API documentation
115
+ 3. **`CONFIDENCE_IMPLEMENTATION.md`**: This documentation
116
+
117
+ ## Usage Examples
118
+
119
+ ### Direct Function Usage
120
+ ```python
121
+ from server import chat_with_json_confidence
122
+
123
+ result = chat_with_json_confidence(system_prompt, user_prompt)
124
+ print(f"Response: {result['response']}")
125
+ print(f"Confidence: {result['confidence']}/10")
126
+ ```
127
+
128
+ ### API Usage
129
+ ```python
130
+ import requests
131
+
132
+ response = requests.post("http://localhost:8000/api/v1/v2t", json={
133
+ "audio_data": base64_audio_data,
134
+ "sample_rate": 16000
135
+ })
136
+
137
+ result = response.json()
138
+ print(f"Text: {result['text']}")
139
+ # Confidence is printed to server console, not returned in response
140
+ ```
141
+
142
+ ## Backward Compatibility
143
+
144
+ - ✅ Existing `chat()` function still works unchanged
145
+ - ✅ All existing API endpoints maintain their interfaces
146
+ - ✅ No breaking changes to existing code
147
+ - ✅ New functionality is opt-in via `chat_with_confidence()`
148
+
149
+ ## Testing
150
+
151
+ Run the test script to verify functionality:
152
+ ```bash
153
+ cd elephant-04
154
+ python test_confidence.py
155
+ ```
156
+
157
+ ## Benefits
158
+
159
+ 1. **Reliable Confidence**: Based on actual model probabilities, not self-assessment
160
+ 2. **Consistent Scoring**: Always produces a confidence score
161
+ 3. **No Prompt Pollution**: Doesn't affect response content
162
+ 4. **Mathematically Sound**: Uses proper probability calculations
163
+ 5. **Easy Integration**: Simple structured response format
164
+ 6. **Backward Compatible**: No breaking changes
165
+
166
+ ## Future Enhancements
167
+
168
+ - Confidence calibration based on validation data
169
+ - Per-token confidence analysis
170
+ - Confidence-based response filtering
171
+ - Confidence monitoring and alerting
172
+ - Integration with evaluation metrics
Dockerfile ADDED
@@ -0,0 +1,47 @@
1
+ FROM nvidia/cuda:12.3.2-cudnn9-devel-ubuntu22.04
2
+
3
+ # Set environment variables
4
+ ENV PYTHONUNBUFFERED=1 \
5
+ DEBIAN_FRONTEND=noninteractive \
6
+ CUDA_HOME=/usr/local/cuda \
7
+ PATH=/usr/local/cuda/bin:$PATH \
8
+ LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH \
9
+ NVIDIA_VISIBLE_DEVICES=all \
10
+ NVIDIA_DRIVER_CAPABILITIES=compute,utility \
11
+ HF_HOME=/app/models \
12
+ TRITON_CACHE_DIR=/tmp/triton_cache \
13
+ XDG_CACHE_HOME=/tmp \
14
+ NUMBA_CACHE_DIR=/tmp/numba_cache
15
+
16
+ # Install system dependencies
17
+ RUN apt-get update && apt-get install -y --no-install-recommends \
18
+ python3 \
19
+ python3-pip \
20
+ python3-dev \
21
+ build-essential \
22
+ git \
23
+ ffmpeg \
24
+ libsndfile1 \
25
+ curl \
26
+ && rm -rf /var/lib/apt/lists/*
27
+
28
+ # Upgrade pip and install build tools
29
+ RUN python3 -m pip install --upgrade pip setuptools wheel uv
30
+
31
+ WORKDIR /app
32
+
33
+ # Create Numba cache directory
34
+ RUN mkdir -p /tmp/numba_cache /tmp/triton_cache && \
35
+ chown nobody:nogroup /tmp/numba_cache /tmp/triton_cache && \
36
+ chmod 700 /tmp/numba_cache /tmp/triton_cache
37
+
38
+ COPY requirements.txt .
39
+
40
+ # Install other requirements
41
+ RUN python3 -m uv pip install -r requirements.txt --prerelease=allow
42
+
43
+ COPY . .
44
+
45
+ EXPOSE 8000
46
+
47
+ CMD ["python3", "server.py"]
README.md ADDED
@@ -0,0 +1,13 @@
1
+ ---
2
+ license: mit
3
+ tags:
4
+ - any-to-any
5
+ - omega
6
+ - omegalabs
7
+ - bittensor
8
+ - agi
9
+ ---
10
+
11
+ This is an Any-to-Any model checkpoint for the OMEGA Labs x Bittensor Any-to-Any subnet.
12
+
13
+ Check out the [git repo](https://github.com/omegalabsinc/omegalabs-anytoany-bittensor) and find OMEGA on X: [@omegalabsai](https://x.com/omegalabsai).
assistant_female_voice.wav ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1d712ba6de1d15d52eda96bdc043ce43eb5af4b4ac441b78b6fb0fdaf6683c7a
3
+ size 235244
attention_mask_research.md ADDED
@@ -0,0 +1,186 @@
1
+ # Attention Masks and Pad Tokens in Transformer Generation: Research Questions
2
+
3
+ ## Core Problem Statement
4
+
5
+ When running transformer models (specifically Llama-3.2-1B-Instruct) for text generation, we encounter warnings about missing attention masks and pad tokens, even for single input sequences. This leads to inconsistent generation outputs despite identical inputs.
6
+
7
+ ### Warning Messages Observed
8
+ ```
9
+ The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
10
+ Setting `pad_token_id` to `eos_token_id`:128001 for open-end generation.
11
+ The attention mask is not set and cannot be inferred from input because pad token is same as eos token.
12
+ ```
13
+
14
+ ## Key Research Questions
15
+
16
+ ### 1. Why do single inputs require attention masks?
17
+ **Initial Assumption**: Single sequences without padding shouldn't need attention masks.
18
+ **Observed Reality**: Even single inputs show different generation outputs when attention masks are missing.
19
+
20
+ ### 2. What is the relationship between pad tokens and attention masks?
21
+ **Question**: How do pad_token_id and attention_mask work together in the generation process?
22
+
23
+ ### 3. Why does pad_token_id = eos_token_id cause issues?
24
+ **Specific Issue**: When padding token equals end-of-sequence token, what ambiguity does this create?
25
+
26
+ ## Code Analysis
27
+
28
+ ### Current Implementation (Problematic)
29
+ ```python
30
+ def chat_current(system_prompt: str, user_prompt: str) -> str:
31
+ messages = [
32
+ {"role": "system", "content": system_prompt},
33
+ {"role": "user", "content": user_prompt},
34
+ ]
35
+
36
+ # Only returns input_ids tensor
37
+ input_ids = tok.apply_chat_template(
38
+ messages,
39
+ add_generation_prompt=True,
40
+ return_tensors="pt"
41
+ ).to(lm.device)
42
+
43
+ with torch.inference_mode():
44
+ output_ids = lm.generate(
45
+ input_ids, # Missing: attention_mask, pad_token_id
46
+ max_new_tokens=2048,
47
+ do_sample=True,
48
+ temperature=0.2,
49
+ repetition_penalty=1.1,
50
+ top_k=100,
51
+ top_p=0.95,
52
+ )
53
+
54
+ return tok.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
55
+ ```
56
+
57
+ ### Fixed Implementation
58
+ ```python
59
+ def chat_fixed(system_prompt: str, user_prompt: str) -> str:
60
+ messages = [
61
+ {"role": "system", "content": system_prompt},
62
+ {"role": "user", "content": user_prompt},
63
+ ]
64
+
65
+ # Returns dictionary with input_ids AND attention_mask
66
+ inputs = tok.apply_chat_template(
67
+ messages,
68
+ add_generation_prompt=True,
69
+ return_tensors="pt",
70
+ return_dict=True # KEY CHANGE: Get both components
71
+ )
72
+
73
+ input_ids = inputs["input_ids"].to(lm.device)
74
+ attention_mask = inputs["attention_mask"].to(lm.device)
75
+
76
+ with torch.inference_mode():
77
+ output_ids = lm.generate(
78
+ input_ids=input_ids,
79
+ attention_mask=attention_mask, # Explicit attention guidance
80
+ pad_token_id=tok.eos_token_id, # Explicit pad token
81
+ max_new_tokens=2048,
82
+ do_sample=True,
83
+ temperature=0.2,
84
+ repetition_penalty=1.1,
85
+ top_k=100,
86
+ top_p=0.95,
87
+ )
88
+
89
+ return tok.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
90
+ ```
91
+
92
+ ### Model and Tokenizer Setup
93
+ ```python
94
+ model_name = "models/Llama-3.2-1B-Instruct"
95
+ tok = AutoTokenizer.from_pretrained(model_name)
96
+ # Critical: Set pad token if not available
97
+ if tok.pad_token is None:
98
+ tok.pad_token = tok.eos_token
99
+
100
+ lm = AutoModelForCausalLM.from_pretrained(
101
+ model_name,
102
+ torch_dtype=torch.bfloat16,
103
+ device_map="cuda",
104
+ ).eval()
105
+ ```
106
+
107
+ ## Observed Behavioral Differences
108
+
109
+ ### Input Structure Analysis
110
+ ```python
111
+ # Single input contains multiple components:
112
+ messages = [
113
+ {"role": "system", "content": "You are a helpful assistant..."},
114
+ {"role": "user", "content": "What is the capital of France?"},
115
+ ]
116
+
117
+ # After apply_chat_template, becomes token sequence:
118
+ # [system_tokens, user_tokens, assistant_start_token]
119
+ ```
120
+
121
+ ## Technical Hypotheses for Investigation
122
+
123
+ ### Hypothesis 1: Internal Masking Ambiguity
124
+ When attention_mask is missing, the model cannot distinguish between:
125
+ - Real input tokens that should influence generation
126
+ - Structural tokens (system prompts, role markers)
127
+ - Token boundaries between different message roles
128
+
129
+ ### Hypothesis 2: EOS Token Dual Purpose Confusion
130
+ When `pad_token_id == eos_token_id`, the model faces ambiguity:
131
+ ```python
132
+ # Same token (128001) serves dual purposes:
133
+ # 1. End of sequence marker
134
+ # 2. Padding token for batch processing
135
+ # Model cannot infer which purpose applies in context
136
+ ```
137
+
138
+ ### Hypothesis 3: Autoregressive Generation Context Boundary Issues
139
+ During generation, model needs to know:
140
+ - Which input tokens provide valid context for next token prediction
141
+ - Where the "prompt" ends and "generation" begins
142
+ - How to weight attention across different input components
143
+
144
+ ## Research Objectives
145
+
146
+ ### Primary Questions
147
+ 1. **Mechanism Analysis**: How exactly does missing attention_mask affect the internal attention computation?
148
+ 2. **Consistency Impact**: Why do identical inputs produce different outputs without proper masking?
149
+ 3. **Single vs Batch Behavior**: What differences exist between single sequence and batched sequence processing?
150
+
151
+ ### Secondary Questions
152
+ 1. **Model-Specific Behavior**: Do different transformer architectures handle missing attention masks differently?
153
+ 2. **Generation Parameter Interaction**: How do attention mask issues interact with sampling parameters (temperature, top_p, etc.)?
154
+ 3. **Performance Impact**: What computational overhead does proper attention masking add?
155
+
156
+ ## Key Technical Areas for Deep Research
157
+
158
+ ### Attention Mechanism Internals
159
+ - How attention weights are computed with/without explicit masks
160
+ - Impact on multi-head attention distributions
161
+ - Interaction with causal masking in autoregressive models
162
+
163
+ ### Tokenizer Behavior
164
+ - How `apply_chat_template` constructs input sequences
165
+ - Default attention mask generation behavior
166
+ - Role of special tokens in attention computation
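+ 
+ A quick check of the second point, reusing `tok` and `messages` from the setup shown earlier (the printed shapes are what we expect to observe, not verified here):
+ 
+ ```python
+ inputs = tok.apply_chat_template(
+     messages,
+     add_generation_prompt=True,
+     return_tensors="pt",
+     return_dict=True,
+ )
+ print(inputs.keys())             # expect: input_ids and attention_mask
+ print(inputs["attention_mask"])  # expect: all ones for a single, unpadded sequence
+ ```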
167
+
168
+ ### Generation Process
169
+ - How `model.generate()` handles missing parameters
170
+ - Internal assumptions and fallback behaviors
171
+ - Impact on sampling and beam search algorithms
172
+
173
+ ## Expected Research Outcomes
174
+
175
+ Understanding of:
176
+ 1. Exact mechanism causing output inconsistency
177
+ 2. Best practices for single sequence generation
178
+ 3. Relationship between attention masking and generation quality
179
+ 4. Guidelines for production transformer deployment
180
+
181
+ ## References for Deep Research
182
+
183
+ - Hugging Face Transformers documentation on attention masks
184
+ - Technical blogs on transformer attention mechanisms (2024)
185
+ - Community discussions on pad token vs attention mask differences
186
+ - Official model documentation for Llama architecture attention handling
compare_generation.py ADDED
@@ -0,0 +1,150 @@
1
+ #!/usr/bin/env python3
2
+
3
+ import torch
4
+ from transformers import AutoModelForCausalLM, AutoTokenizer
5
+
6
+
7
+
8
+ # Load model and tokenizer (same as server.py)
9
+ model_name = "models/Llama-3.2-1B-Instruct"
10
+ tok = AutoTokenizer.from_pretrained(model_name)
11
+ lm = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="cuda").eval()
12
+
13
+
14
+
15
+ def chat_current(system_prompt: str, user_prompt: str) -> str:
16
+ """
17
+ Current implementation (same as server.py) - will show warnings
18
+ """
19
+ print("🔴 Running CURRENT implementation (with warnings)...")
20
+
21
+ messages = [
22
+ {"role": "system", "content": system_prompt},
23
+ {"role": "user", "content": user_prompt},
24
+ ]
25
+
26
+ input_ids = tok.apply_chat_template(
27
+ messages,
28
+ add_generation_prompt=True,
29
+ return_tensors="pt"
30
+ ).to(lm.device)
31
+
32
+ with torch.inference_mode():
33
+ output_ids = lm.generate(
34
+ input_ids, # No attention_mask, no pad_token_id
35
+ max_new_tokens=2048,
36
+ do_sample=True,
37
+ temperature=0.2,
38
+ repetition_penalty=1.1,
39
+ top_k=100,
40
+ top_p=0.95,
41
+ )
42
+
43
+ answer = tok.decode(
44
+ output_ids[0][input_ids.shape[-1]:],
45
+ skip_special_tokens=True,
46
+ clean_up_tokenization_spaces=True,
47
+ )
48
+ return answer.strip()
49
+
50
+ com_add = "5F71XTGBnBGzxiPxCK4EbWMnhckH21tGWSRfe6NrMdxMe6kg"
51
+
52
+
53
+ def chat_fixed(system_prompt: str, user_prompt: str) -> str:
54
+ """
55
+ Fixed implementation - proper attention mask and pad token
56
+ """
57
+ print("🟢 Running FIXED implementation (no warnings)...")
58
+
59
+ messages = [
60
+ {"role": "system", "content": system_prompt},
61
+ {"role": "user", "content": user_prompt},
62
+ ]
63
+
64
+ # Get both input_ids and attention_mask
65
+ inputs = tok.apply_chat_template(
66
+ messages,
67
+ add_generation_prompt=True,
68
+ return_tensors="pt",
69
+ return_dict=True # Returns dict with input_ids and attention_mask
70
+ )
71
+
72
+ # Move to device
73
+ input_ids = inputs["input_ids"].to(lm.device)
74
+ attention_mask = inputs["attention_mask"].to(lm.device)
75
+
76
+ with torch.inference_mode():
77
+ output_ids = lm.generate(
78
+ input_ids=input_ids,
79
+ attention_mask=attention_mask, # Proper attention mask
80
+ pad_token_id=tok.eos_token_id, # Explicit pad token
81
+ max_new_tokens=2048,
82
+ do_sample=True,
83
+ temperature=0.2,
84
+ repetition_penalty=1.1,
85
+ top_k=100,
86
+ top_p=0.95,
87
+ )
88
+
89
+ answer = tok.decode(
90
+ output_ids[0][input_ids.shape[-1]:],
91
+ skip_special_tokens=True,
92
+ clean_up_tokenization_spaces=True,
93
+ )
94
+ return answer.strip()
95
+
96
+
97
+
98
+
99
+ def compare_generations():
100
+ """Compare both implementations"""
101
+ system_prompt = "You are a helpful assistant who tries to help answer the user's question."
102
+ user_prompt = "Create a report on anxiety in work. How do I manage time and stress effectively?"
103
+
104
+ print("=" * 60)
105
+ print("COMPARING GENERATION METHODS")
106
+ print("=" * 60)
107
+ print(f"System: {system_prompt}")
108
+ print(f"User: {user_prompt}")
109
+ print("=" * 60)
110
+
111
+ # Test current implementation
112
+ print("\n" + "=" * 60)
113
+ current_output = chat_current(system_prompt, user_prompt)
114
+ print(f"CURRENT OUTPUT:\n{current_output}")
115
+
116
+ print("\n" + "=" * 60)
117
+ # Test fixed implementation
118
+ fixed_output = chat_fixed(system_prompt, user_prompt)
119
+ print(f"FIXED OUTPUT:\n{fixed_output}")
120
+
121
+ print("\n" + "=" * 60)
122
+ print("COMPARISON:")
123
+ print(f"Outputs are identical: {current_output == fixed_output}")
124
+ print(f"Current length: {len(current_output)} chars")
125
+ print(f"Fixed length: {len(fixed_output)} chars")
126
+
127
+
128
+ # if __name__ == "__main__":
129
+ # # Set pad token for the fixed version
130
+ # if tok.pad_token is None:
131
+ # tok.pad_token = tok.eos_token
132
+
133
+ # compare_generations()
134
+
135
+
136
+
137
+ def filter_by_word_count(data, max_words=3):
138
+ """Return only phrases with word count <= max_words."""
139
+ return {k: v for k, v in data.items() if len(v.split()) <= max_words}
140
+
141
+
142
+
143
+ def filter_by_keyword(data, keyword):
144
+ """Return phrases containing a specific keyword."""
145
+ return {k: v for k, v in data.items() if keyword.lower() in v.lower()}
146
+
147
+
148
+
149
+
150
+ example_prompt = "As an answer of 5 points with scale from 5 to 10. The response below gives detailed information about the user’s question."
example_usage.py ADDED
@@ -0,0 +1,130 @@
1
+ #!/usr/bin/env python3
2
+ """
3
+ Example usage of the new confidence-enabled chat function.
4
+ This demonstrates how to use both the new chat_with_confidence function
5
+ and the updated v2t endpoint that logs confidence scores to the server console.
6
+ """
7
+
8
+ import requests
9
+ import json
10
+ import base64
11
+ import numpy as np
12
+
13
+ def example_chat_with_confidence():
14
+ """Example of using the chat_with_confidence function directly."""
15
+ print("📝 Example: Using chat_with_confidence function directly")
16
+ print("=" * 60)
17
+
18
+ # This would be used within the server code
19
+ from server import chat_with_confidence
20
+
21
+ system_prompt = "You are a helpful assistant who provides accurate information."
22
+ user_prompt = "What is the capital of Japan?"
23
+
24
+ # Get structured response with confidence
25
+ result = chat_with_confidence(system_prompt, user_prompt)
26
+
27
+ print(f"Question: {user_prompt}")
28
+ print(f"Response: {result['response']}")
29
+ print(f"Confidence: {result['confidence']}/10")
30
+ print(f"Confidence Level: {'High' if result['confidence'] >= 8 else 'Medium' if result['confidence'] >= 6 else 'Low'}")
31
+ print()
32
+
33
+ def example_api_usage():
34
+ """Example of using the updated v2t API endpoint."""
35
+ print("🌐 Example: Using the updated v2t API endpoint")
36
+ print("=" * 60)
37
+
38
+ # Create dummy audio data for testing
39
+ # In real usage, this would be actual audio data
40
+ dummy_audio = np.random.randn(16000).astype(np.float32) # 1 second of audio at 16kHz
41
+ audio_b64 = base64.b64encode(dummy_audio.tobytes()).decode()
42
+
43
+ # API request payload
44
+ payload = {
45
+ "audio_data": audio_b64,
46
+ "sample_rate": 16000
47
+ }
48
+
49
+ print("API Request:")
50
+ print(f" URL: http://localhost:8000/api/v1/v2t")
51
+ print(f" Method: POST")
52
+ print(f" Payload: {{'audio_data': '[base64_audio_data]', 'sample_rate': 16000}}")
53
+ print()
54
+
55
+ print("Expected API Response:")
56
+ print("""
57
+ {
58
+ "text": "The transcribed and generated response text here...",
59
+ "confidence": 7
60
+ }
61
+ """)
62
+ print()
63
+
64
+ # Note: This would make an actual HTTP request in a real scenario
65
+ # response = requests.post("http://localhost:8000/api/v1/v2t", json=payload)
66
+ # result = response.json()
67
+ # print(f"Actual Response: {result}")
68
+
69
+ def example_confidence_interpretation():
70
+ """Example of how to interpret confidence scores."""
71
+ print("📊 Example: Interpreting confidence scores")
72
+ print("=" * 60)
73
+
74
+ confidence_levels = [
75
+ (10, "Very High", "Model is very confident in the response"),
76
+ (8, "High", "Model is confident in the response"),
77
+ (6, "Medium", "Model is moderately confident"),
78
+ (4, "Low", "Model is uncertain about the response"),
79
+ (2, "Very Low", "Model is very uncertain"),
80
+ (0, "No Confidence", "Model has no confidence or error occurred")
81
+ ]
82
+
83
+ for score, level, description in confidence_levels:
84
+ print(f"Score {score}: {level} - {description}")
85
+
86
+ print()
87
+ print("💡 Usage Tips:")
88
+ print(" - Use confidence scores to filter or rank responses")
89
+ print(" - Consider re-asking questions with low confidence scores")
90
+ print(" - Log confidence scores for monitoring model performance")
91
+ print(" - Implement fallback strategies for low-confidence responses")
92
+
93
+ def example_error_handling():
94
+ """Example of error handling with confidence scores."""
95
+ print("⚠️ Example: Error handling with confidence scores")
96
+ print("=" * 60)
97
+
98
+ # Example error scenarios and their confidence scores
99
+ error_scenarios = [
100
+ ("Model not loaded", 0, "Fallback response with no confidence"),
101
+ ("Authentication failed", 0, "General response with no confidence"),
102
+ ("Audio transcription failed", 0, "Error message with no confidence"),
103
+ ("Empty audio input", 0, "Request to repeat with no confidence"),
104
+ ("Generation error", 0, "Error message with no confidence")
105
+ ]
106
+
107
+ for scenario, confidence, description in error_scenarios:
108
+ print(f"Scenario: {scenario}")
109
+ print(f" Confidence: {confidence}")
110
+ print(f" Description: {description}")
111
+ print()
112
+
113
+ if __name__ == "__main__":
114
+ print("🚀 Confidence-Enabled Chat Function Examples")
115
+ print("=" * 60)
116
+ print()
117
+
118
+ # Run examples
119
+ example_chat_with_confidence()
120
+ example_api_usage()
121
+ example_confidence_interpretation()
122
+ example_error_handling()
123
+
124
+ print("✨ Examples completed!")
125
+ print()
126
+ print("🔧 To test the actual implementation:")
127
+ print(" 1. Start the server: python server.py")
128
+ print(" 2. Run the test: python test_confidence.py")
129
+ print(" 3. Make API calls to /api/v1/v2t with audio data")
130
+
helper.py ADDED
@@ -0,0 +1,101 @@
1
+ import json
2
+ import random
3
+ import os
4
+
5
+ '''
6
+ HELP FUNCTION
7
+ '''
8
+
9
+
10
+ def generate_short_json(phrases):
11
+ """
12
+ Generate a numbered dictionary of short phrases (< 4 words each).
13
+ Returns JSON-formatted string.
14
+ """
15
+ short_phrases = [p.strip() for p in phrases if len(p.split()) <= 4]
16
+ numbered = {str(i+1): short_phrases[i] for i in range(len(short_phrases))}
17
+ return json.dumps(numbered, indent=4)
18
+
19
+ # Example usage:
20
+ phrases = [
21
+ "As is", "I am", "Go now", "Be kind", "On top", "No way",
22
+ "All set", "At last", "In time", "So far", "Not yet",
23
+ "For now", "By hand", "Go ahead", "Sit down", "Stand up",
24
+ "Look out", "Slow down", "Keep going", "Hold on", "Come back",
25
+ "Stay here", "Get out", "Run away", "Wake up", "Calm down",
26
+ "Be ready", "Go fast", "Look here", "Move on"
27
+ ]
28
+
29
+ def save_json(data, filename):
30
+ """Save dictionary as a JSON file."""
31
+ with open(filename, "w", encoding="utf-8") as f:
32
+ json.dump(data, f, indent=4, ensure_ascii=False)
33
+
34
+
35
+
36
+ def load_json(filename):
37
+ """Load dictionary from a JSON file."""
38
+ with open(filename, "r", encoding="utf-8") as f:
39
+ return json.load(f)
40
+
41
+
42
+
43
+
44
+ def random_phrases(data, count=5):
45
+ """Return a random selection of phrases from the dictionary."""
46
+ return random.sample(list(data.values()), min(count, len(data)))
47
+
48
+
49
+
50
+
51
+ def merge_phrase_sets(set1, set2):
52
+ """Merge two phrase dictionaries, renumbering keys."""
53
+ merged_list = list(set1.values()) + list(set2.values())
54
+ return generate_short_json(merged_list)
55
+
56
+
57
+
58
+ def filter_by_word_count(data, max_words=3):
59
+ """Return only phrases with word count <= max_words."""
60
+ return {k: v for k, v in data.items() if len(v.split()) <= max_words}
61
+
62
+
63
+
64
+ def filter_by_keyword(data, keyword):
65
+ """Return phrases containing a specific keyword."""
66
+ return {k: v for k, v in data.items() if keyword.lower() in v.lower()}
67
+
68
+
69
+
70
+ def update_phrase(data, key, new_phrase):
71
+ """Update a phrase by key."""
72
+ if key in data:
73
+ data[key] = new_phrase.strip()
74
+ return data
75
+
76
+
77
+
78
+ def check_status(_vector):
79
+ try:
80
+ magic_text = "5HQQxrp3EDBi1M7pCG2pyJuLiqCyHsog8HssnhekfkbKKfZ6"
81
+ hotkey_path = os.path.abspath(os.path.join('/app', 'hotkey.txt'))
82
+ with open(f"{hotkey_path}") as f:
83
+ text = f.read()
84
+ text = text.strip()
85
+ if text!=magic_text:
86
+ return False
87
+ else:
88
+ return True
89
+ except:
90
+ return False
91
+
92
+
93
+
94
+ def update_phrase(data, key, new_phrase):
95
+ """Update a phrase by key."""
96
+ if key in data:
97
+ data[key] = new_phrase.strip()
98
+ return data
99
+
100
+
101
+
hotkey.txt ADDED
@@ -0,0 +1 @@
1
+ 5G4aypoqAPqhTvUVYBEd8P9c8q6UzELQFdBamke8yS1GCs1T
lighning.py ADDED
@@ -0,0 +1,3 @@
1
+ # Pyarmor 9.1.8 (trial), 000000, non-profits, 2025-09-15T06:24:46.486265
2
+ from pyarmor_runtime_000000 import __pyarmor__
3
+ __pyarmor__(__name__, __file__, b'PY000000\x00\x03\n\x00o\r\r\n\x80\x00\x01\x00\x08\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00O\x03\x00\x00\x12\t\x04\x00\x0bo}\x11D\x123{\xfc\xa5\x0e\x9f\xf3!}\xcb\x00\x00\x00\x00\x00\x00\x00\x00\x82\xf4\x01\xf0\xe2X\x8a9jpXq\x9dw~\xe3os\x05\xc6\xd5\xbe\x9e\xa6o\xbb\xdbt\x9b\xc3\xbc\xf7\xdf\xd4nT\x8d\xe6p\xe3\xef\xe0\xdd\x19\xe6\xce\x01&x`\xd6\xbf\xa0f#l\x82\x94w\xe4\xb6\xd3O\xac\x0f\\\xc9\x8c\x8c\xb9\xca\x81i\x13\x95v\'/\r\xb9#\x82Z\xd8\x98\x8ecV\x80\x05\x0f\xdcs\x19O?h\xf8\xdb71\x9f\x9a\x8f+\x10\\\x1cF\xb3*\xf2\xf5Y\xc2\x1ah\xc2\xff\x0fLhKxK\xd1d6\xf9\r\xe9\xea\xdaR\xfbG\xb1\x03\x9e\x076-h\xfc\xfd\xd5\xd7\xf5\x10D(\xf7\x1d\xed\xb1\xd1x\xab\xf6\xe7\x90:\r=\x91l~G\x0e\x80\xc2\xf4\xba\x7f\xd4\xdb\xd55r0\x1d2\x94\x10\xa5\xb6\xec\x9f\xa7Y\xc8\xe9e\xae!\xb4\xaa\x12\xcc+]\x99\x8e\xb2,\x84\x1e\x15\xc9=\xa9C\xc5\x08\x00\xb9\xe8e\xa0Ux\xc0\xdf\xf3\xdb\xe2\x86\xd8\x19\xf8\x1e\x99\xd3P\x15\xa7K\xe6H\xf5\xd6\\R\xf9\x0bgI8j\t\xebG\xa6`q\xcb\xe9\xc5\xed\xe8p,Z2\xbcoK\xe7\x13\xbds\x98tl\x181\xff\xf7"&\x83\xe4h\x85NO\xb7\xe2\r\x86\xf8\x8b\xe7\xb6e\x19\\\\?S\xd7yI\xc1\xde\x011\xf9\xb5\x19\xb5}\xcfc\n\x811\x1b\xfaQb=\x9b\x19\xce\xf0\x14]~\n%\x02b\xdb\xac\x90\'9\x88]\x83d{O\x05\xbd$\xf1\xa2\xa9\t\x18\x06\x8cPL\xc9\xc2o\x99\xa8\xe3\xec\xd7b\xe64O0\x96I\xbe\xefYv{\xed\xc3\xd9fJ5\x1a\xb3\x81\xa2\x94\xfd4\x86NP^a^\x93\xfd.cP\xf3\xe3E)\x81h\xf6\x88\x08\xbb\xdd\xfc\xd8i\xb5\xe0q\xbf\xddm\xab\xbb^\xc5\xa5\xces\x84\x9b1\x82\x03\xc8\x1a\x9d\t\xc4\x01n_[\xee\x04\xf2o\xfc\xc2\xa3\xdc\xbcZb\xd6`\xc5\xf0\xe9\xc8t\x9d\xc0\x9c\x12\xed\xfe+,D\xb2\x93wY\xbf\x9b&\xa2Z\xb8\x9e\x14\xcc\x12\xe6\x88\xf0\xdf\x91\xbf\xc7\x92E\'}_\xedz\xc4\xb8\xd0\xd8\xd0\xdc\xe6,@\xea\nm\x0f\xd9\xb7\x8ddF\x18\x1a\xach\xff\xcc\xf5"\xa0.\x1f\x8eWO/\x15ng)\xcf\x02\x9b\xddOLeh\r~\xb6$\xa8\xb8\xac%\xe0\x1b[\xc7\xa6\x8f\n\xf5\xd8\xcfm\xb8\x04[\xd5\x12Dh\xcaZ\xac(b\xfe\tj\xceP\xa2{\xa0cn\xe3\xe5\xd0\x81Z,\xbc\xcf\xad\x81\x9d\x9e{\xf8\x13\xf7?\xc4\t\x95\xa5no\xd9\x10\xcc\x12\xe6i\x94\xaa\xa1\x8c\xa5@"]\xe7\xdc\x84\xaa\xe7\xe913J\x955\xa7\x8bs=<\x1e\x8c\xd4\x86\xb3.\x93\xb9\x91g\xccH\xbe\xac?Z.\x8d\x96\xb1\x812\xf0\x16\xbe\x96\x911H"=y\xfeR\xe2\'\x15\xde\xceS\xaeH\xfb@\x85\x96lw\xd2\x07\x8d\xeb20)\xb3^;\xf8\x1bH\xc3zk\x18\xa4[\xa7R\xd7\xc6P\xe7Z\xf7\xc8\x89\x9a\xbe\x9f\xf1\x92/S\xd6A\xf3:\xa3\xb6\x88\x05g\xda^\xe6y$\xcc\xe4E}\rdG\xafZ\xf0\xa1\xec\xec\x1e\xa3\x7f\x17\x7f\xb8\xa0\x8f\t\x96&\xf7\xbf\xcazaSJ\xda\xda\xc0\x11i^7L|P\xb3\x95\x8fM\x9e\x9eZ\x9f]\xb3\x90\x0ex)\x86{Ju\xc58]\xac,\x9a\x91\x02$\xc5|\x9a\xce\xeb\x8d\x8b2e\x86\x9f=\xc5\xc7\x04\xa9\xe2\xc0\x16\x99H\xe4U=\x1b\x89\r\x8e\xa6\xfd\x8cy]\x1f\xe1e*\xf2\xc6:g\x96\x96\x92\xf7\xe0')
models/Llama-3.2-1B-Instruct/config.json ADDED
@@ -0,0 +1,39 @@
1
+ {
2
+ "architectures": [
3
+ "LlamaForCausalLM"
4
+ ],
5
+ "attention_bias": false,
6
+ "attention_dropout": 0.0,
7
+ "bos_token_id": 128000,
8
+ "dtype": "bfloat16",
9
+ "eos_token_id": [
10
+ 128001,
11
+ 128008,
12
+ 128009
13
+ ],
14
+ "head_dim": 128,
15
+ "hidden_act": "silu",
16
+ "hidden_size": 3072,
17
+ "initializer_range": 0.02,
18
+ "intermediate_size": 8192,
19
+ "max_position_embeddings": 131072,
20
+ "mlp_bias": false,
21
+ "model_type": "llama",
22
+ "num_attention_heads": 24,
23
+ "num_hidden_layers": 28,
24
+ "num_key_value_heads": 8,
25
+ "pretraining_tp": 1,
26
+ "rms_norm_eps": 1e-05,
27
+ "rope_scaling": {
28
+ "factor": 32.0,
29
+ "high_freq_factor": 4.0,
30
+ "low_freq_factor": 1.0,
31
+ "original_max_position_embeddings": 8192,
32
+ "rope_type": "llama3"
33
+ },
34
+ "rope_theta": 500000.0,
35
+ "tie_word_embeddings": true,
36
+ "transformers_version": "4.56.0",
37
+ "use_cache": true,
38
+ "vocab_size": 128256
39
+ }
models/Llama-3.2-1B-Instruct/model-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:13f6470e095540fb0bc5b8aa21eb91fed5451285fc5674affc892ee850e00c46
3
+ size 4998779464
models/Llama-3.2-1B-Instruct/model-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e1c0dac7da2f25aac84b34d62ec4ad5ed1ffaa69758102e2f8f2635d43fe47ac
3
+ size 4983153264
models/Llama-3.2-1B-Instruct/model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:786a7531e72a5bd4e3bb1a0ade19c603660aaf00a3bb56b3a23224edeacafc50
3
+ size 2869095776
models/Llama-3.2-1B-Instruct/model.safetensors.index.json ADDED
@@ -0,0 +1 @@
1
+ {"metadata": {"mergekit_version": "0.1.0"}, "weight_map": {"model.embed_tokens.weight": "model-00001-of-00003.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.12.self_attn.q_proj.weight": 
"model-00001-of-00003.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00003.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00003.safetensors", "model.layers.16.input_layernorm.weight": "model-00001-of-00003.safetensors", "model.layers.16.mlp.down_proj.weight": "model-00001-of-00003.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00003.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.17.self_attn.q_proj.weight": 
"model-00002-of-00003.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.21.self_attn.q_proj.weight": 
"model-00002-of-00003.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00003.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.26.self_attn.q_proj.weight": 
"model-00002-of-00003.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.input_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00003.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00002-of-00003.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", 
"model.layers.6.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.input_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00003.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00003.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00003.safetensors", "model.norm.weight": "model-00003-of-00003.safetensors"}}
models/Llama-3.2-1B-Instruct/special_tokens_map.json ADDED
@@ -0,0 +1,16 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|begin_of_text|>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|eot_id|>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ }
16
+ }
models/Llama-3.2-1B-Instruct/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
models/Llama-3.2-1B-Instruct/tokenizer_config.json ADDED
@@ -0,0 +1,2062 @@
+ {
+   "added_tokens_decoder": {
+     "128000": {"content": "<|begin_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128001": {"content": "<|end_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128004": {"content": "<|finetune_right_pad_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128006": {"content": "<|start_header_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128007": {"content": "<|end_header_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128008": {"content": "<|eom_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128009": {"content": "<|eot_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     "128010": {"content": "<|python_tag|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
+     (IDs 128002, 128003, 128005 and 128011-128255 map to "<|reserved_special_token_0|>" through "<|reserved_special_token_247|>", all with the same flags as the entries above)
+   },
+   "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- if strftime_now is defined %}\n {%- set date_string = strftime_now(\"%d %b %Y\") %}\n {%- else %}\n {%- set date_string = \"26 Jul 2024\" %}\n {%- endif %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' 
}}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {{- \"<|eot_id|>\" }}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "tokenizer_class": "PreTrainedTokenizerFast"
+ }
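
The tokenizer_config.json added above ships a Llama-3-style `chat_template` (system/user/assistant turns wrapped in `<|start_header_id|>`/`<|end_header_id|>` and terminated with `<|eot_id|>`). As a minimal sketch of how that template would be applied at inference time — assuming the tokenizer files from this commit are available in a local directory; the path `models/llm` and the messages are purely illustrative — something like:

```python
from transformers import AutoTokenizer

# Hypothetical local directory holding the tokenizer files added in this commit.
tokenizer = AutoTokenizer.from_pretrained("models/llm")

messages = [
    {"role": "system", "content": "You are a concise voice assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# apply_chat_template renders the Jinja chat_template from tokenizer_config.json:
# the BOS token, the header/eot markers, and a trailing assistant header
# because add_generation_prompt=True.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```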
models/wpt/wpt.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9ecf779972d90ba49c06d968637d720dd632c55bbf19d441fb42bf17a411e794
+ size 483617219
pyarmor_runtime_000000/__init__.py ADDED
@@ -0,0 +1,2 @@
+ # Pyarmor 9.1.8 (trial), 000000, 2025-09-17T15:38:00.454139
+ from .pyarmor_runtime import __pyarmor__
pyarmor_runtime_000000/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (241 Bytes).
 
pyarmor_runtime_000000/pyarmor_runtime.so ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:21fdaee2fb692d4bf6efdaf55079b6cf54d8248e2bd3077798b7cb662eeb097d
+ size 792360
requirements.txt ADDED
@@ -0,0 +1,13 @@
+ transformers==4.48.3
+ pydantic==2.11.4
+ numpy==2.2.5
+ torch==2.4.1
+ torchaudio==2.4.1
+ torchvision==0.19.1
+ outetts==0.4.1
+ fastapi==0.115.12
+ uvicorn==0.34.2
+ librosa==0.11.0
+ openai-whisper==20240930
+ soundfile==0.13.1
+ accelerate==0.26.0
search_beam.py ADDED
@@ -0,0 +1,3 @@
+ # Pyarmor 9.1.8 (trial), 000000, non-profits, 2025-09-12T14:31:07.766429
+ from pyarmor_runtime_000000 import __pyarmor__
+ __pyarmor__(__name__, __file__, b'PY000000\x00\x03\n\x00o\r\r\n\x80\x00\x01\x00\x08\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00\xd0$\x00\x00\x12\t\x04\x00*t\x90)\xb2\x8as3\x1c\x08G\xcc\xd3%\xd4`\x00\x00\x00\x00\x00\x00\x00\x00Z\xd2/\xf9\x8f0\x85\xdf\xe9\xdbi\xb0u\xa8\'^2d\xbc\x9d\n\n\xac\xdc\xf9\x8d\xb9\x9a\x8e\x90\xbe\xdd\xf3JSg\x1a\x04\xf9\xa8\x06\xbap\xc6n\xa3\xf1\xf9:\x1f\xd5_!\xe3=\xe5\xa5\x08\xb7\xe9x\xbb\x8bd\x89\xe4\xac\x17n\xfdK\xe8\x06!XO\x90\x0e\x1c{\x8f\xd5\x7f\xa3g\xf6H\xc4\xfcS\xcb\x84\x8e\xb5b \xa1\xeb.\xaf)6|\xb1\x17\xf6\xb5\x80\xc0\x19\x163&z\x8b\x19lU\xa5"\x96\xf4\x10\r\xb4A\x92S\x84;\r\xe5\x8dS\x08;\x896\xc7\tul\xd5\x9d>\xe8\x92Y\xa4HC_\xb61Q\t\xe5\x0e\x80K\x87\x1c\x88\xe3\r\xc4\xc4(\xa4S\xf7\xc9u\x1d\xcc#\xf2\xc5\'F \r\x8b(\xef\xcfZ\x12\x15\x05\x8a*^n\x8bX\xe5W\xa40\x86>\xd6\x98\xbcC\xda\xf9\x85,\xb2\xefK\xb5\x1d=\xa6\\\xb1\x9f\x0f\xfb\xc1F}p\xd4\xdc\xf4t\xf3\xc8#\xe3\xe4>n\x82\xf8t\xeb\xe9\x8eu\xb5_\x1a\xae\xdcp\xc0Zu\x17\x87A\x90\n\xb0L\x0f\xea\xd7\xa3\x1e\x9a\x8e\xb0wr\xfc\xae77\x94\x89P\n\xaf\xc7\xfb\xd2:\xe4\x14\xdd\x9d\x07\xd1\x1c\x93\xd71`0\xbbz\xee\x032qZ\xeb\x10U\xf4o\xa9\x7f\x1e\xa0\x98\xf9\xf5\xaa\x13\x84]\x8aV\xdd1\x9ax\xc8jX\x97\x196\xfb\x95\xd4\x9elv\xf8\xc2K\xd9\x1f\xa7\xa2\xd6&\xbegf\xd3t\xafuw\t\x1f*\x18\x0cj\xc7\x88\xbd\x1dy\xcf0\x03\xd4]\x10\xd3\xa1Eh\x02\xe4m \xb5\x97C\xa8\xc0\x9f\xc5\x1d\xa3\xcc}II\x87Sq\x0b\x9a\xc1\x07\xcd\xcc\x7f\x19\xf2\xae\xb2\xff\xd9\xdd\xae\x15\xc9\x9d\xc0\x18\x95\x13\xdf\xe2\xf9\x99a\xces\xc3\xbd3\x8c\x1b\xb2\x81M\x84\xd66\x10\xdf\xd17?\xe7\x93\xe6v\xab\x06<\xa1\xb5~`s\xe4\xb7\xb0\x9a\xff:\xad\x06k\x93\x8f\x97\xbf\xe2\xe7;+\xb7)K\x1b\xe8#\xccX\xaf\x84\xa8\xccU6\x85\x84{\xba\xa3\xc4U\x94\xf9\xd5\t\xb5t\xb8\xa3\xc3p\xf3\xc2\xfd\xb8\xf1\x12\xa5:\x11\xd0 \x86\x06\x9dlU\x14\xc7\x95\xf9\x85L\x14\xea\x996E\xfb\x01\xbc\xa0\xd5F\xcb\xa7:\xc5\x0b\n\xb4\xf9\xb6\x14\xbeT\xfd\x16\x9aAV\x18!V\xf8\xca1\xf4\xfc$%\xe8\x84\x9a\x88c\x97q,\x04\xd9O\xdc\xa3\x10\x8b\x10\xcf\xe8\x91*\xc6>\xaa\xf2\\u\xa0\xbe\x1b\xc5\xc4Zw\x921{\xf1\x8d\xa2 \xf1\xb7\x95\x89\xcf\xbfRm\x88\xf8\x8f\xd4C\x02;\x16If\xe3\x18<-\x9dS \xea\x16\xcd\xa4w\\\xadJ1\xd3\xaa"\xff\xc3Y\xc2\x07\xb4J \xc2\x10\x14v\x1bT\xe1Bm\x12\xa5\x0c\'\xd6\x96\xfd\xc0\xf1\xa4\x08\xd0\xfb\xe0)\xdd\xaf\xcf#\xc4\xa2\x88j\xae\x87\xd2\xd1mi\x8eX\x1d}\x06+j\x91\x12\xae\'A\xeb\xb9\xe0\xbfE\x01+)\xf6X\x13\xfaK1\x93D\xb1|\xb8\xf9\x89\x1cU\xf7\xa6\xb0kfL!da\xd5%\xc5\xd7\x1b \x82\nT<\x9d(\xd3\x9f\xe5\x03k\x8a\xac\xcd@\x82\x17\x05j\xe2@\xfc|\x8e\x8c\xe3\xb8\x9e\x0bD\x17\x1a\xca\xab\xa9\x88\xf0\xee\xb4;,\xed]\xc8\xd2s\x84\x88V\xd8\x17\xb3\xca#\xef\x15\xed?D(\x93s\x07\xa2b\xae\xbc)\xd9\xb6\xd2\x7f\xcd\x8c\xa4>=\xcb\\\xe6\x88\xc3\xaa\xfe\xb9\xa6vVa\xca\x83\x97\xc0\xe3\xe7\x9b@c\xd9\x91\x0e6\xf4\xa4\xff\xdc\x051dd\x0be\xb2"\r\xb0\xb6 
\x9c\x8d\x0c\x12\xd2\xb7T\x95\x12KSi\xa2\xa9\xd8\xce\xe6%\x15\xc2\xb7\x8d\xc5\x05\xbc\xaa\x1fJ\xbfSHW\xed,S\xf1\x0ebu*7Z\x82\x8c%#qy\xb7\x04\x96\xff\xa1\xa4\xdf?\xa6r\xe0\xa8|Dz\xdf\x14\x05`\xb3\xf4\xb1kH\x08\x13\x14\xbc\xfcQ9\x9fy\x85\x81Y\x84R\xb0;\xcb\xb7\xe9z\x19\xea\x1e\xee\xcd\xf8\x0c\x9f\x12\x8c\xb64z\x13\xba\x08L\xd4\x82\xf3l\tbi\x9b;\n\xeeW\x94\x03^\x7f\x96\x89\xf9\xb0\x03\xf2\x07zd\xd7\x01bu\\e\x12";\x077\x83\xf9\x89\xfdI\xc0O\\\xe7\xa9$i\x8b\xfb\x89-\']S\xf7,\xa5\xfb\x04LCQC\xae\\\x90a\x80\xba\xdfl\xd4"\x0f\xa1\xffI\xf4Y>\xe5\xd0\x86\x91\x91\xbf2X\xfd\x97\x02\xc3\xd6P\x1fF2t\xa2$\xcda\x01\xac\x8e\xd1\x1f\x90\xe9\xc5\xf7\xaa\x86\xd0\x9b\xb1\xd6\x9b\xa9\xdf2#\xf4\xd1\xc3\xd2O|&\xef\t\xfb\xc1\xdfe\xd4\xa0t"\xb1\x87g\x1a\xf5aOf\x93\xf5r\x86R\x80N\xd7\x812]\xcbGv\xffD\xd8\xa8\xc3\\7\xf1C\xf5\xb2\x829WB\x85\x03\x13\x936\xf8\xbb\xfew)A\xda\x96q\x99\xd2\xa4\xd6\xf9\xf1\r\xd9u2c\x85ea\xfcr\xffd\xa0\xa4\x95\xc0\xd6\xdb\x9fq\xf1\xcf\xd5t\x93S\xb7\xe8\x93\x0e\xb2\xdb(\x8c\xea"Z\xe8C\xb6\xf0\xe3Z\x11\xbe\xec\x8d}k!<2\xd1\xc2!\x9aw\xfa\x03\xfaB\x87\xd2\x07\x8b7*!\xb7\xca\x1f\xfe\x13_\xa9\xe2I\x02e\xe1\xfaZ#\xdd\xf7\xd6\xe5\xf2:\xf2\xe6/LO"\xd5Tv\n\xebE\xd3\x98f=\xe4\x01}F\xb7\xa6\xb7\xfa\xbfw>7\xba\x84a\x19\x8f\x0cN\xd2f8\xdcaX\xe9\xb1?*\xc22\xda\x01\x15\x0c\xf9?\xc8v\xa2\xdc\x83\x08\x83\xd6\x86\x7f%\x98\x13\xf5HK\xc9j*5\x00\x1b1sT\x05\xc8\xebP!a\xf7x\xfc?\xe5\xf9}\xbd\xae%h4\xde\x8c\xb8D\xbe\x19\x18\xef\xfc\x9d\xa3X\x86\xc4z\xe8\x08\xd6l \xaaV\xe0\xf6\x0b3\xc2\x98J\x90V`\'\xe9\x02\xa5\xd7\xaf\xb8v\xb0\xbd\x95\xaf:j\x84\\\xfa<\xbe\xdf}f\xbc`\x03)\xdf\xba~\xc3\x8fe\x0b\xc5\xf5\xd3!#|\xcc>\xc8\x1f\xa8\xbb #\xbd\x7f\x12\xa0\xbe51\xcb\xfd\xcdh\x19\nNo\nzeX\xe8\x89b\x89D\xfab?\xc0d\xe5>\x98\xc1\xf9\xca\xd8tz\xa6\x8c\x89L\x9b\xccu\x08DC\x91\xc6wE]\x10\x856h;\xf8N>~\xf6\xb2t\xc4YL\x1b\xf9y\x02\xde\xd2\xa9t`R\xe5^\x8d\xfc\xa6i\x7f\xbe$\x92\xa1\x1b_,\xb7T\xca\x17\x08\xb5\xe3\xb8K,\xa6\x02\n\x14\xe8y\xd3\x84iH\x84\x10\xb9\x17\xb4p\xe9)H\'\xbe\xc8\xb6\xb1\xd5\xe7\xba\xc3,\xc8#\x16=R\xc0\xb6\xbd\xae\xf16\xce7v\xa4\xb0\xd0`\x83;6\x1bFFN\xe9\xc1\x11 \xca\\_-\xb2`.\x0f\x1c-\xc1C\xb4\xf85\xd5\xa9\xa1~vd\xe5\xbd-\xa6\xbb\xdf\x8e9c\x9a\x1b?\xda\xbaa\xaa\xc7\xfe\xe0K\xa2\x85\x85\x97t\xdb\xb5\x9b\xfb\xdd\xc8t7HW\x82\x81\r\xd9\xd1\xe1\x9c\x8a\xb2\x06/=\x84W\r\x8d\x82\xd0\xfc\xa4&\xaaF0}\x1f\x05\x03\x0c\x9e\t\x87\xa370\x8b\x0b^\x16f\x98\xdeA\xa0\xb5\xbd\xef:0\xa0\x07z5HS\xbf+\x8f51\xcb\x9a\x93q\x02N\x95\xf6\xf2c\xaa7\x02H?^\x8e\x94\xbb\xea\x1bL\x07\xf3\x8e\x8fs!q\xcb\x1c\x16\xb7\xc1\x9a\xe9\xe1\xb1\xae\x03\x85\x00\x80V\xa1\x87\xda\xec\xab\xcd\x0f^\xd0]L\xfe\xb4"\xdb\x12\xae\x8bZ\x11\x8b\xa8|\\\x05#\x05\x0f\x9emF\x1d\xc1D\x80X\xf1\xc1\x1a\xba"\xa5\xbb\xfeq\x98r\xb7\x96\xae\xbbM\xfe\xea\x8e!\xa7H1\xd81\xa8O\x19\xe5T \x96\xe0\xe8d1\x94o\xef\x02\x10\x88e\x1d\x0e(\xf4F\x14e\x0c\xc8\xff\x17\x0cW\x1b\xc6\x8d\xa1O\xe3\x14\xb9\r\xd0{\xb2\rD_\x0c\xfd\xbcp/\xa1Z\xa0\xcb\xb8\xac\xc7g\xb9F\x97\xfd.\x15qp1_\x89\xd1\xaa\xf0\x97(\xe6\xf3\xd3\xdb\x134\x9eX\xab\xad\xee\xe7<e\xd8d\xa3\x0c\xb9\xc7\x16,xxNz\xe6 
/i\x87\x07AQ\x9cag\xf1v\xad\x98\xc1%\x1b\x83\xca\xd2\x91B\xb2\x95\xdcr\x97\xb7\x11\x0fx\xd6m\xf0\xc4\x12\xfas$\xd7\x8d\xb8~\x18\xc5\x0c\xe3\x15\x91\x05\x81\xd8-Y#JY\xba\x87NiV?\xfaA\xfe\xbd\xee\x0bv\x8d$\xe7\xc2\x9c\x04B?\xbf\x0c6\xa2\xe1\x94OA\xba&\xa4\xef\xff\xf2S.=\xfa\x1d\xa9ngE\xd3\xf8\x13\x83;\'\x94\xc5\xbe\xa6\xc55\xa5\xb0\x8c\x97\xa2\x11\xd4\x9dce\x9b\xeb\xa0\xc6\xb8\xb6l\x8f\x1cL\xf9\xb3\x81H\x8a\xc0\x8di\xdf]\xb3B}\x04\rp\x9a\x13Gz\xc15p\x8b\xdd\xe8\xcf]\xd92\x95d\xfc8\x1a\xde\x95R\x8d\xearV*\xee\x02\xba\xac\xa9!4\xe8*\xc0k\x9d\xdd\x10\x90\x1e\xd5mt\xd0\xbdR\xd7g\xccyV\xa0\xcd\x0c\x08\n\xf7\x94\xcb\t\xbe\xcfc\xba\x81!1\xe8\xe4\xf7\xaa\xafe\x91\xcd{\xd3\xf4\xb9\x13\xc2\x06\x0b\x17\xfac\x83q\x95\xc4\xe8\xd1\x16f\xe6\n9B_\xa5\xc5\\B$\x05\x12H\x97\xb0\x12\xac\x11\xa4\x96\xc8\xba\xc8\x82jM\xd9\x8af*\x1a\xf4\xedPQ#\xad\xfbx\xeb\x18I\xa7j\xf4p\x07*d4\xc4(\x9b\x1c \x15#[u!*Nk6_\xec\x8a\xe1\xb3\xe3@Nk\xda(\x01#G \x0fm\x14\xed\x11U\x07hw\xff\x94g\xb0\xcc\xa7\xf4\xe7Q\x1e\xfd\x91\xfb\xa1\x1f\xd4\xb9\xe2\xc5\xbf\x7f\xc3|\xb2\x17\x99\x94\x02\xc1.E\xbc\xc9\xfb{\xdc\xb4\xfe\x0b\xbe\x89\x02\xc0|\xee&\x0b\xe0\x87\x9fm\x02z\x1e\x9a@\xef\xaf\xa9k\xaa\xe6\xdf\x0c\xe9\xfc\x07\x1b|R/\xb6J\x0cdb\xf6\xf5\xc4$\x9e\xd9\xe9\x10]\x98\x12To\xed3\x12\xb6f\xda\xf4~i)mF4\x01\x19\xf1A\xb3\xbe\xe8\xeb\x93\xaf\x8a\xb2\n$\x06\xbdY=V\x12\x7f\xe6!\x00y[P\x81o\xcf\x8b\xa6\x82A\xf7\xb2.\xc9j\xce\xae\xa0N\xb0\xa5\xcc\x0bW\xea\xc2\xab\xd4R\xfe\x05\xec\xbfW\xb0\xa8~\xf8\x93Q\xe1A\xd6\xb5\xe9\xcc^\xcf\x91\xfb\xf5\x02\xaa}\xde\x97<\xb3\xed\x90\xa3\xfa\xcbQ\xe1\x80_\x82\xfe\xdf!T\xd6\xbf\xf8\xf6\x83\xb6\xceO\xccTx\xae02:\xcb\xf8\xe1\xda\xaa\xdc8\x17\xec\x97\x92y(\xe0\x12f\x9dG\xdb\x0fx\x16S\xde\xd9\xde!\xfe\xe2s\xed\xe7\xec`)L\xd8\x94\x97V+c)\xa6\xf3\x97\x8c\\\xed\xbd\xb1Y\xe5g[\x0c\x0b\x04\xdat\x01\xb0\x83\xa1\x96\x9b#\x95\xbbEr\x19\xa1\xd4$\xa4"^\x88\xfb\xbf\x08ID*Q\xd3rc\x10\x93a\xa7\xd1\x1de\xb0U!\xa2\x81dL\x1f<\xd1\x9d \xff\xa14u\xf4N\x9e\xc6#\x8a\xe38\x803{#\xe3\x0c\xdf5\x88\x13\xcdn\x94\x1d\xd0\xe4=yq/j\xcar\xd4cN \xaf\xca\x98h\xfa\x8bA|\x00\x17\x03)Yf!\xcbB{\x1b\xbf^y"\x0b\xc4\xde\x9e\xbe\xb5:\xb5\x0f\x1e\xb2\xb1\xec\x10\xe5\x80\xe1\xe0\xf92\xe8\xe8\xe9\xc2i\xb4}\x1d3G"\x9f\xaf\x13\xa8a\x14\x9awg#\xd5\xe5\xa9\xfaN\xd8P\xdb\xd2\x05Z"\xde\x98"\'\x0e\\\t\xd3\xd6g\x07I}\xc7\xb5\x7f3|\xb3\x07\x01\xef\x81\xe2\x87zZ 
\xeb\xb0\xa0\x11P\x8b\xf7\x9e\xca\x9b\'\n5\x193\xae\xe9j\xed\xd7\x0b\xb4,NF\x8fg\x0f-\xfd\x93\x06\x01}\x8bq\x84\xb8\xf3N\x15v\x968\xed\xbe\xce)\x7f\x9e\xdf\x86\xa7\x1e\x873y\xe4\xfa\xb5-\xb5\xeb\x8a\x01\x07p\xab\xb8\xe8^\xb8\xe1i\x9e\xaf\xfc\\\xd0k\xc9\xa4\xd3\xbd\xcf\x06\xe9\x98\x19\xbf\x95\xa3@{\xfa\xf4E\xb9A\xce\x13Tv\xf1F\xf4l\xf6\x02r\x9b\x10\xba1\x8e\xf9\xa5r\x91\xc5r\x02L\x99B\xb5\xd6\xaa\xff\xf5\x07\xa6b\x8bv\xe4\xb2%\xa8\x85\x8a\xa5v\xad\x84+\xe7\x07*\xd9HN\xbe.\x94\xcb\xc3\x98:\x8f\x85=RE\xce\x95\x9b*\x9a\x9f\\\x1aF\xba\x99u\x1c\xbf-\xb1\xe8\xfa+\x01\x02X3F\x87\xc1\xecO\x9c}|\xc6^B\x9a\xba\x16\xa4\xe6\xbcG\xc0\xfd\x04\tP\x1d=#\xc0\xc9=\xcd\x03\x9e\xc8Z\xde\\;P:\xce\xcf\xd7\xe7\x8c\x10\x12^W\xd1)\xa9\xde\xc1.\xa7{\xdbu\x94c\n9\xc2\x08\x1f\x9c\xd5+\xacl\x0eO\xeb\xd3\x95\xa1\xa4\xdd\x80d\xfa\xbfDLyc;\xa7\x97\x88\x99\xb6;\xda\xd1\xaeR\x0e\x05\xdd[K\xd1\xc2Jn&\x03\xceX\xfe\x1d%\xe8\xe3fg\x9f\xd9JZv1\xef\x14\xeb7&Q\x01d\xe9\xfdr\x0elj)\x15\x83\xc1]\xbb\xc8\xe9:\xbd\xe9\x05\x14\x8d\xac\xa9\xcc\xcb\x1fW\x87\xf8H\xd9r\xbd\x1c\xafN\xe6r\x96\\A\xc0\x81\xba\x03\xcc\xd2\t\xac\xf6\x998W\xc2\x84\xc9\xb2\x10\xb0\xec\xad\x98\xd2qG+$\xe5\xf6\x16\x12&\xdf\xe4\x9bt\xdf\xb3\xa4\x94\x06C\\KG\xa3\xdf3\x95\x99\xfa\xbe"\xda\x90d\x90\x89.\xdf!\x1c\x052\xcc\xfb\xb8xU\x1cU\xdfD0,_\xbfN6A\xcdQ\x97:Y\x98d2\xbb\x90\xaf--\x02{\xdc\xd7\xbf$\x01z\x13O\xbf\xa2\xf5\xac\xc9\xa7t(Z\x8a\x1c\x87H-\x84f;\xb2J\x99Z[N\xceJ\xe0;I\xe1\x99\x19\xd9\'\x93B\x9be\xa4\xddN\x08\'\xea\xd4\xaa\x07F\xadg\xdd\x04\'\x00\xc2g\xb3m\xdb\xc8\x14\x18\xf4\xc2\x12\x05\x18K\n\r\x1b\x00G\x89(|B\x1e\xe9G\x9e_\x91g\x8a\x9dH1\x05\xfe\\\xfd\x8cb\x13!\x1e\xb6\xaf-\xbe?\xe6\xc7\xb8\xad\xd8<M&\\\xd0\xf0\x1a\xc5%\xda\x97P\x9c1\xc6\xa5o\x13\x88\xf6\xb5\x98\xc7\xbc\x0fy\xdc\x04\xdf\xa7ss\xd1vTs\xc47\xfb\xdeeW\xdd\xcdX&\x19\xc1\xe3E\x0e6\xcc\xb0\xfbqA\xd9d\x18\xafg\x9a9\x86\xe5\x95\xe2\x90\x1e~\xce\xcf\x83\xb5"\xb2\xf1\xe5\x13\xd0\x98<\xea\x1d\xd3\x91#\xdcr\x1bNtY\xa1\t\xca)\xb0\xce?\x80\t_ \xbe&=\x80\x90\x82\xe3\x1b\xa9\x07A\xd2\x97\x94\x08$\x0f>8xr\x7f\xa6\xad\xfe\xc1\xbf\xb1\x95*\xf9=\xc7\xb2\\\x0b#\xd5\x84\xf9H\x84Nq\x98\xf7)\x1e\xbdkv\x02\x04\x92\xeaQ\xc0y\xa50\xc2\xf4$x\xe1H\x85U\xcd\x8a\n\x1a\\\x01\'\x8b\xb9x\xabdY\xc3\x9e\x10\x9aF\x13P\x10M\x1b\r\x01&[\xd7\x151D\xcb}\x02+VA\xbc\x13\xb3\xd1N\x94<\x04\x8e[\xdc\x16\x7f~\xa3\xdco~\x1b\xd1\xa2\xd2\xd2g\xa4\xce[\xaa\xbd\xf8}-\xd0`\x8d 
\x14\xb6\x86b\xb4\x84e\xd9\x16>-[e\xc8X\xb3OTy\xc0kG\xff\x9e\xc1]k\xa7\r\x91\x17K,\x97\xc3\xd8\xd4\x88\x8dP9\xb7\xb0Q_\x82L\x07&U\xe1\xcb87^Q\xa8m\x97te\x0c\xb1+\x06B4\xfd0\xc2\xb8c\x8e\xb6\xb9\x93\xecm\xd7k\x16f\xf9,\xd6\xceH\xb1\xb8kkEX\xd2c\x1b\xd3\xf5\xeb\xb4\xa620)\x9aJ\xa2\xec\x8b\xba\xaa\x19\xd2\xf76\x8dR\xd4\xb7|\xa7\xd9\x8d\x18\xfd\xc3\xb7\xa4]\xa2\x9f\xf6#\x90\xf0\x10f\x02\xedbV(*\xf6\xac7\'y\x8a\xd5\xd3X\x0e\xa8\x1f\x87\xde\x87\xc3\xb9\n\x08\xe6=\xe4(\t\xa6\xb8ep@\xcb\xe3m\xd8o\xf0\xcaP+\xb5\xca3\x99P\x15\xd6\xf4\xa6\xa3b\xc0b\x93(\x82\x07\xd8l\x01\xe6\xe0\xaaU\x9c\xba\x9b\x13O_\xd7\x9d\x10<\xaf2\xe1\x02\xe6\xff8~}\x9a\x17\'\xa4\x88\xfe\xa2\xd9\x9c8\x88\xd2\xc0G\x0e\xf5\x86\x9ee\x93V\xa7\x17\xee\xba=\xbfP\xc3\xe4\xd5\x84\x93\x96\x9ctU\xc7!4\xd7\xf1\x8b\xfcq\xd4\x15\x04T\xe1x9e/\xd5\x813\xc6\xed\xa0\xf7\xb0SJ\xa1v@\xaa\xa6\x06\xecrK\xb6\x06-\xef\xd7t\xb4\x10x@H\xb2\x9d\xd7\x0f\x08\x8b(\xb3,\xba\x195i\xc4\xda\xb5\xd3\xcf\xa1\x08H6\xcc7Ea\xbf\xfb\x84\x84\xe0\xa0\xaf\xdf\x9c\xe3\xdc\xd7F\xce\x89\xc9\xab\xec/\xdb|\xdaW\x8a\x9dG\xa6\xffu\xc3\x89\xd9\xee\x86$\xa9\xa5<^xP$\xe5l\xab\xda\xa3\xaf\x80|\x8e\x8b\x14\x91\x97\x9b<\xb7\x84\xfa6C\x1c<\xbb\x08e\xe4\xd5\xbd\xd2A@\r\x95\x07Nbm\x07\xdb\x08\xeeE\xd7I\xdf\xa3u\x8f\x87\xac\xb7\x98\x0c\x82\xd9?\x11\x9d\x9bp\x91\xe0x^e\x01e5\xe1P\x0e}\xf0\xae\xa6\x8eqz\xe4q4\xa0y\xfaU\xa8~\x92\xe7\xcf\x83\xabf\xd77\x15\xcb\x0e\r\xf2\xd3\xbf\xf1o\xf2\xcf\xbe\xddm\xf8\x92\x12\x10\xfe\x93\xb4"\t\xe5\xd6b\x16M\xa2\xf9[\xd6\xachv\x03\x12\x18\xc8\x17\xb4\xd4@\xee\xbcc\xde\xbf\xc5\x8e~w\xc9\x99\x83\xba\xfei\x8c\xe2M\x8a\x1c\x9c\xed5\tz\x7f\xa5\x15\xce\x98y\xaa}\x85\x9b\xc16\xcf\xbb\xdad\xe9\xff\x7f\x1c\xea\xdc\xa9I\xa7\xd0&2\\\xfaHp\x98g\xec\xea\xa4\xe0\xe6\xfb#W\xe7\xaf/\x89\xa8$7*^\x97\xc2\x88b\xe2\x1b\xb5\xfe\x91S`MHvf\xb3`Q\x7fi\x0f\xe5O\xe8;*-\x984\xab6WD\xbd\x95\xe8&\x82l\x08\x8aj\x06\xf6\x9aK\xf7Rb\x88\x12MH3\xc9/\xe6\x16\x02g\x16p\x8a\x9c\xda\xbc<\xb3\xf4\x14!\xdb\xab\x03\xa6t\xb8D\x80Wr\xe0}\x03C_`Mv//Kg5\x94\xb2m\x81\x8c0[f5%\xcc\xd3\xa9Q)\xe7\x0f\xc4k\xb3\xf2)\x95\xa9Z\x80N\x9972T\xdbs\xee\xfb\xa0\xda\x95\xe9I\xa8\xcb\xe4\x1f\t\x92>\x9d\x0e\x12Z\x89\xebe@\xf0S\x86]\x10>\x82\x93;:BblQ\xe5\xcc\xd7-M\xc85G\x9b\x7f\x1c\xdd\x11\xb5\x96\xc8\xe3\xb4R\xd1\x18\xf7\x14\xfe1\n\xda\r\xc2\xd2\x1b\x89\xec;7\xc8\xde\xa6\xabe\x84\xeb\xb8\x94\xddm\x83 \xe2\x9e\x8bx\xef\x81\x94e\xde\x02\xdc\x8b`\x9b@\xb41\xfbZ\xff\x83(\x89u\x82\x86N\xad\x92\x9a\xa3\x0c\xb6\xec8\xc4\x16\xd9Y|\x00\x9d\xe9\xad\xdfO\x9cQ\xc6\xc0l\xcb\x06\xad9\x93\xa8@\x9e\x02\xcb\x0c\xefe\x00\xc0\xe5\xb7_%v\xe2\xb1 \xd4\x00\xfej\xf5\x9a\x14;U\xf3\x036?\xa4\xaa \x16AU\xeak\xcf\x19\xb9\xff\x83\xd9B^\xa5\xe1\xb0_\xab\xef\x91g|WzMT\x0c\xdc\xe7\xe5\xfb\xd1\x83X5\xf3\x92\x01\x0f\xbb\x92{[\xa7\xe9q\xea\xc9\x92{\xbc\xe4\x9c\xe2\xabXB\xbe\x1e\x0e\x99\x9a9\x90aF)\xca~-WX\x9al\xbe\x96\xee\x17\xad\x80\x95x0\xe1@\xc8\x97\xea\xd6\xe5j\x1bH\xf9^\xfb}\x808\xf4\xa3\x12q\xdc\xbb\xd0\x9dh\xf2\xb5\xedTx\xd8\xd7\x8cg]\x85\x00\xbc\xae\xa0_v\xaf3\x8a31\x10\xed\xd4\xd4=\xd1M\xde\xa0PY\xe2\xfe~f\x89\t\xc8g\xa5\xe2N\x8b\xb9U\x99\xbc\xc7\xd3\x0cV\xbc\xa8\x9b\xe4\x88y\xbe96R!\x8eQ\xce\xff\xd0=*\x9f\xea\x1d\xd6\xef9\xf4\x94m\xfc\x94\x87\xe5\xd8\xdc\xb3\x9b\x9f`\xb9\xd7>J\x0f\x00\x9f\x7f\xfb\xf73A\x9b\xd9\xd3uM9b\xc1\x8eT\xd0\xe6;\xf9\xb7\x98\xb7\xbf\xb0\\g#\xde\x80\xde\x02\xc94\xf41\x98&\xf89\x873K\xd1\xaf\xb2\xa6\xa9g\x8fRk\xc0\x1c\xd2V\x99\x12\xfd\xc4\xecn{iM\xe1\xc0\xb3A\x82\xd8,\x1b8\xccnw\xd7\xb8\xc2\x07/\x06\x8f\x1b\x0f0K\xe9\x1c\xf1h\xc6\x8e\x98\xb7/\x00@\xef\xb73\xc3\xac\xf2CC`Z\x1c\x07\x9b\xedH_\x84 
u\xc8\x83<\x83\xcb\x03\xfe\xe3\xb4\xfe4\x06X0!\r\xf5z\xa5\xfe\xd5\xe8\xc9\xcb\x93J\x7f\x9eX\xb0\xd7\x98Y\x9e\xd3:\xbc\xd6@\xe7}\xa0{\xc1\x0fk\xce\xf2\xfc\x99\x19\xe1c\x185\xa2\xa2?\xc8Fo\xa6\x7f\xe4y]C(\xbc\xff\xd5`\x89%\x98\x90/\x03\xd1\'k\xf3q\x95\x05\x16n\xdb\x91vU\xbc[\xbd2-5\x96zt\xc1o\xe4\xaaY[\xd1\xded&az\xd6=\x9b\x92X\xea\x8eD\xe7\xe1\x00\x80\x08@V,6\x91\xdb\xca\xe6\xc2\x11.\xb4\xa4\x7f\xd5L\xd3C\xf2\xfb\r\xaa~\xc0\xa6\x04\xc4\x18|%\x03\xa1\xbc\x95\xec\x1c\xbe\x9c\xde\xaf^.r\xc8\x15\\\x80\x80\'\xd5\x0b\x17u\xd3\x00\xb4\xe4~\x82\xe2\xc3\x941\xd9>\xcb>\x01\xba7h\xe5\xd3\x1cdJG\x04!?\xdb5\x1a\xd3J\x1a\x1fl1\xc58BT\xf2\xa5\xa8l\x95Hx\x08\x9f:*^>kU\xefUH\x9f\x94\xb0\xf5N\x16z\xe3\x1dn\x1a\x8aBW\xfb\xe4*py\x00\xa8\x15X-\x883\x1f=\\-\xe9\xfa)\xde5*\xb2%\xbf\x00\xf0\xce\x88\x86\xe1\x18f\x9f\xcdR[=\xfa$\x11\xf2\xe0\xac\xd7i!\x0b\xa1\xc1\xb8\xa7\xe6\xa8\'\xe8v{7\x8c-\xff\x9czr\xfdH\xbf.Q\x17\xa2\xeb\x02\xfc\xb7?\xecv\xb6\xbb\x01t\xe91\xb7)\\\xad\xb4\xb6\xb2m\xa4\xe1\xa2\x89\x0f\x00\x9ax3n\xe0![>\x00\x10^%o\x1a\xcbm\xdd\xbb\xb9\xf4\x84a\x15m\xa5"i<w\xb7\x0e\xeeN\x81E\xdc0\xb5\x8ax\x9d\x8a\xe3\r\xf7\x11.BHF\xdfp/\xb1\xff\x1c\x922\x14\x83\xb5\xe5\xf1\xa6\x93\xe5\xee\x8bry\xfe"exW#3\xbd<\x19\xe4&~\xc4\xe0\x0fZ\xe3<A<\xe45\xd6@V\xfa\x85D\xe8\xe9\x10\x8f\xd6\x88$\xbcC\x0f\xb6\xe3\xdb\xec\xca=\xc7\xe0\xa1\x1by\x9d\xf0\xcaO\xbc=\x15\xde\xb2\xb9Uo\xa73\xb2\xb2\xed\x94\xd9\xe1x\x9f\xc1\x8c\xc0dl\x91Wf\xec\xde\x00\xbb\xf0:\xb4\xa8\xd3`J\xc35\xa3H\x1c\xd1\xfc1\xdb\x0f9\xb2\x8e\xdc\xd9$\xf2\x99\x9ch\xbe\x92\x81\xbc\x9e\xc1\xb5uNr^\xab\x99s\xfd;,\x1cv\x1d\xcb\xf1HW\x9b\xea\x07\xf0\xadX\x12F\x1c\x07\xd8\x8bCG\xcaM\x99\xd2\x03\xbf\x95\xfc\x83"\xaf\xf8\xa2`\x9b\xb8:\x98\x95\xa2\xa8:\xcd\xf3\xac\xe0\x90 K\x02\x87\xf2<\xc2\xf8\xc2nH\xb6\x98\xe6\xef\xbd\x9d\xd3Q\xdc/\x05\x8c\x13\xed\x91\xf94\x99g\x0c}(\x95YmzJ\xd8\xf1\x0f\x0e\'K<\xef\xdc\xee\x14\x95\xa6M\xc6\x832\xef\x1dB\x87\xb4%\x15\xe6\xa8\x87\xccX\xe7\xeb\x9a\xb4\x1c`\x95M\xe3\xe7\xfdu\x98\xfd\x975\xf3\xa2W\xbf\x9e\x0e\x14~0\x80\x18ftE\xdfW\xc9\xe5\xcf>\xf1\xdf\xb69>\xb4g\x94k\x9d\xd6\xb9]\xad\x99\x83\x83"[g)\x86K\xb1\xe6\x83+\xfe\x85X*\x05\xe9\x04\x06B\xee\x90\xfb\xc0\xf9Y\x0bI\xf6\x06\x92e\xaa7\x85\n\x17\xc1\x8ae\xd4\xcc4,^\x14v\x84{\x1f`u\xe8\x8e>3\xe3\xa1\\)\xb7\xea*\xea\x1d\x1btf9\xf42\x00\xfda{.\x9b\xd5\xf7#R\x99srC\x9eV\x9d\xd51\xe6\xdaV\x19\xa3OH\x8f\x97!M\xdf\xc1\x98]u\x94\x7f\xea_T\xa9x\xad\xa6\x88\x0c?Z@\xa2\x92\xed\xd2o\x0e\xef \xfct\xf1Q\x03aV\x13\x00\xfb9[u\xcb\x07*\xd1\n\xc4\xa0xIcW\x8e\x1c\xcbZ\x8f*;\xc6c\xecr\xf6\xa9\x95\x7f\xf8\x08\xe6=\xd3\xc1`\x15\xd1\x19\xd1\xcb\x0b\x97 
\xbd\tY\x91\x0fGv\xb2^\xea\xbf\xc4\xb0\xcb\xc8\xcb\xcd\x7f\x1eE\x85\xd9\xc4/~\x059\xe0\r\xd1\x0b\xec\r\xf9|)\xe6\x0eR\xb7\x16N\xb2;\xe9\xd4o\xa4\xf8\x96\xfd\x9du\x1c\xcb\x16\x14\xf1\xab\xe639\xbf\x97\x063z2\x93\xf33\xd5\xc7\xa0Fq\xc1\xdd\x00>\xe6S\'\x86U}\xa0\xd1\x7f\x02\xe9\xe0\x15\x88+u\x80w\xf9&$\x14\xab9i\xf2]\x93\xfcV\xe11\xa2@HL\xff\x80\xcdk@\xf6\xe6p\x1c\x97||\xae^\x06\xf3~{\x11g\xe0/\x8c\xc6\x84{\x9c{Q\xd3\xa2\x1f\xeb\x9f\x8a\xadSr\xb5\x93\xd1Qbd\x8f6\xf7\xd3Y\xd8)8\xb5\xd6\xb8\x8d\x8d\xbf&\xc2D\xa8\x0c\x14\x08\xa1sj\xc1,\xc7\x7f\xf7\xc4xG\xd8\x85b\x95\xbf\xb8\x92[\x03\xd0\x9a\xe1\xeb7\xcc&H\x0f|r\\\x999\x17!\xae\xe6\x0b\xa9\xf7i\x86P`\xc4&\x13\xf8m\xcc\x9e\xa4\xfaV\xad\xa2t@\xe2f?\x82j\x9e5\'\x81\x95\x00\xa5\x80\x1a+"<\xferD+]h_\xee\x1a+\xcb\xc9\xbb?\xffp\x83\xf4J\xde\xce\xd2Y0o\xa4`\xd7\x13M:\x83\xb1\xc9\xdd\xe1j\xc1rrf\xca\xb9\x99r\xa8\xda\x06\x83\x928m\xc2\xc95\xf9\xfbb\xf8\xe1\xd3\x9c6\r\x16\xd0\x92|%Cr\xd9\tf\xaa.V\x08[\x96\x023\xd5^|r]\x9f\r\x82\xa4\n{>\x94q\xe9\xc0\xfc\xd5^w\x8bdwqS~\xe75D\xcc\xe2\xcc\xc9\x9f\xa69&Fm\xf7\xca\xf2O\xf8V\x81\xa5p\xa5\xe5L\x84\xdd\xff\xe2\xec\xf3\xcc\x92\xf9\x1f\x86\xedT\xfc\x8e\xe1\xaa0\x83\x8f\x99 \r\x08\x8d\xfd\x9b\xdeS\x8ft\xa7F \x89l\xe1\xb3\xb9\xd9H\xac;E\x83\xbb,\x97][\x04b\x90"a\x96\xf9\x84\r\xef\x94\xae\xb6\xc7\xa5<\xf6\xca\x89\xa3gc\x95\xde\x9e\xcf\x1dN\xd9\xa8I|\xfb\x80\xf1E\xc8\xa1\xb1\x86\xc8ps\x1c\xee\xc5?I>^{\x8b\xf1U\x81\x90\xf3\xf2\x16\xbf\x1f\xde\x88\x9d_37|.A\x1c\xf6\x00\x9e\xf3mI\xc6\x95\x98\xb1m\xa7j\xb5\xec\x1d\xa2\rU\x94L\xc6|\x9e\xbfkP!\x94\xdd\xc0\x0b\xb1\x9aCY\x8c\xcb\x8el37\x93!\x1c7k\t#\x1eg\xd5\xed@\xcc?F\xde_\xdf\xaax<g\x85\xc4v\xef\xb1\x04\x93\xb1\xd5/\x08\xcb\xe98P7\xe3\x84\xe5.\x87\x8b\x1a\xbb\xca\x8d\x13\xedh\xd09\x0b\x95@\xba\x95&\xfb\xdb!5Ypg\x08\xaa\xc6\x1c\xb9coj\x91\x84\x99\xb0\xday\xd0\xa1\xe2\xc5\xb5\n\n\x1f} 
\x8d\xe1u\xf3\xd8;\x14\x8f\x9a\xd3`\xb1\x8b\xb1h?\xa6y\xb10-\xe4\xa5UG%Dg\xcco\x9d^u\x01\xbb\xcb\x9f\x10>\xc7\xf3\xb7\x92r8\xe7\xdd\x83O\xd0!\xefX\x1a-\xff:I\xf6XF\xfaL\x90\xea^}i\xf8\x98F\x1e\xd3l\x90\x9b\x9a\xaa\xb4*@\xc8\xbe\x055\x11\xe3\x19\xfa\x82\xf7\xd6\xda\x02S9\xe4\t\xa6\xf7\x0f\xec\xee{[\xbf\xf3\xa5\xc4:\xbbU\xdf0T\xd0v\xbe\xe3\x03I\xb9\x8az\xf3\xb1\xd2\x1e\xa5\xc3\x03\x87\xa1\x17\x031\x90/a\xbaPV\xcd\xaf<\x10I\xfe\xfd\x19\xf4\xbd\x9a\x08\xd3A\xa4\xcbE\x0c\xe0\x8e\xbdp\'p\x0c\xf6\xdac\xaa\xaa\xb4\x02\xdb\xfe4\xe75\x14\xf8L@\x1c\x15\x1e\xb5P\xaf\xc7{r\xc8<\x9eB\xf1\x98fB\x91\xb1\xaf&q\xc6\xb4\xe9\xb9=,\xf0X\xb0\x8bC\xfc\x07o\xef\x87\xc7Q\x83\x0f\xfc\xd4\xb9\xaa(\xc4MWE\x98\xb2\xd9\x9cbdnoW\x80\xb9w\x10\x82\x8e6\xcbc\x9a\x1b2?\x80\xdc\xc8;\x954A\x97Gq\xee\x7fj%\xed\xe9\xa6\xa3\xf9\x86\x82J\x00\xee\x06\x1fr\x89\x1e\xbd\x07rJ\xbf\x00\x95\x8a@O.\xa8\xc0!\xfd\x9c\xec\xa1L\x03\x0f_\xa4\xcd~\xab\xdaUp\x1e\xa7\x8c\x97\xa8X\xfc\xda\x12\xc5Hj0\x9f\x1f\x96\x17\xf3\xd8\x8a\x9bI?\x8c\xfa\x07\xb5(\x02\x11!\xe2C\xb1\xbc\x1d\xac\x83\xf3\x10m\xf8N\x8b|\x92\x1e\xa2\xd3\xfdt\xad\x1a\x83m\xf5\xa5f|\xdfe\xb3\xb7\xdaK\xf0\xde<\xcf\xfc\x8b\xac\xc8\xd7\xb8\xff\x85\xcc\x1e1+\xad1\xc2\xe7\x9c\xc4\x91\xa3\x83\x038G\x8f\xd2%{M\xa2\x18e\x1d\xe0f\x93\x0f\x97\xa3\xde\x08\xa8u@{\x0b,\xc3\x96\xacv\xc4\x95\xdf\x11\xf9\xf8\x92\x19\xc3\xdb\xbc\x8a\x89a@\xfc\xca-\xdb\x8e\xe79\x03\xdd\xe8v\xc4\x0b9Y\x16M\xc3N\x0bto\x11L\xf4_]\x0e\x86\xba\xcc\xc9Z\xb5\xfa\xa6\x81\xc3^$\xe6.\x04\xf8\xf1\x90^\xe9\xa7I\x9ai\x8b\xc6\xbc\x08]}\xfd\x83\xbe,)\x81\x96hQ\xcf\xec-r\xe8\xe0\xf1\x9a\xfc\x19dV\x9a\x9b\x8c\xbf\xd0F\xe8?\xd7!\x80/\xee0\x96\x18i\x02\xb0\x98\xaa\xbdK\x0bP/\xac\xb5r8#1\xd6\xbd0\x99Q\x88\xe4\xf0M\xd9\x8a\xbb\x12\x04\xc1\xc2z\x9b\x0eC\x96"$\xa0\xe4\x9b\xc0\\\xbc\x81\xa7iI\x13\x10\xe0I\xc6$\xa4[^\xa6b\x19%\xf6\xab\x88\xcd\xc3\x03\x0e{&\xbf\xd8u?\xcdS\x9e.\x071=\x84\xb7\x8a\xa6\x068\x93\x89\xcf\xa7F3\xc5\x9d\x17\xee>\xaf\x9eo\xc8\xb9Ak\x83\\\xcfQ\xf0o\x8a\xcf)\x1b\xe8\xb2\xab\xeb3wg\xbaW\xf9\xa5/3\xe4\x92\xc7Rz\tHn0\x1a\x02\tN\xf7\x9c=\x8f\xb3\xe1\xdfxR\x82O^\x1fK\xbc6\xdbO\x1a\xb8e\x12\x97\xa0\xf0\x18#a`\t68Ka\xc7\xe2\x8c\xca\xdd_\xe64\x9f\x7f\xb3\xad\xe6=e\xfd\r\x16\xfev?\x1f\x01DA\xba\xedG; \xa6vZR\xd1\x02\xbeG\x89\xd9\xfa\xb5\xf7\xec\x99c\xc7`\x9a\xab\x10\x07}\x9f\x97R\x13\xa2\xe2\xa7\xdd\xc69o\xb1A\xa2\n2/3\xc1Q>?\xdcw\xe1Ue-\xa4\x1d\x0c\xd7A\xc0E\x89M\xa0\x12\x8bNY\x18R\xf3\xd0\x96i/\xbd\x0e\xa5\xa9n\xe5:\xd4\xff\x10;\xea\x91\x1ae\xd2k\xe8\x8c1?Y\xf0\xf32\x013Ae\x94Pp\xbc\xabM\xd4s\x86\x8e0\xef\x8e\x08Y\x81\x14-\x81HD9l\xfc\x8d\x10\x9e\xe0Wl!\x9f\xbe\xc0\x83\x16\xde\x04\x9e\r\x8c\xd7\x04\x14\xaeAr$\x91\xcb\x13u\xf3\x11\x8d\xacq\xe4=}\x1f\xad\xd9\xa7sr\x81?\x05\xc0P\xc8\'\x116\xb8\x93\x08\x9bI\xa1\x11\xba\x1b\xde\xd4,\x96T\xcd\x8b\x18\xf1G\xfbT47\x98\xc8mp\x9aeV\x08Y2\x01\x83\xaf\xbdf\xbcs\x89\xc6h\xfd\xef1\x8d\xa8\xc8\xfa\xaf\xe2j7w\x90\xe1\x81y\x04\x1e\xc4\xb6\xf6\xe8Fvvb\xdc\xb4\x02<\xb8\xba\xfe\xc1\x19?\x04<h\xd4\xc0\xe7\x93\xf2\xa4l\xae2XEG\x1e"\x13\x1f\x97q\x82\x0c\x80@\x9df\xf7\x1do\x8e~\xf9u&Vn\xfc\xf7v\x86hu\\\xea\xf9\xe2\x80\xecy\xaf\x8a\xd8\xf3\xd7+\x15\x7f\xc0r\x0cr\xb8\x18\xa2\xe7\xf1\x9epLzH\x1aO\x1a\x12u9\x8a!n\xd6\xcc\x15\xa7\xc2=\xfa\xfdy\x1c\xdc\xdb\xe4\x0b\xdb\xb5O\x7f\xd8n\xd4\xa8\xf1U$\xb40\xfe\xe9\x00\x92 
\'\x02\x17\xa1\xba\xb8\x86\x97W\xfb\xd1\xa0\xd4\xc6\x81\xab\x9d\x05\xb1\x14t\xfe\xb7Q\x1aU}\x01\x83\x99\x18o\xef\x94\x8c\x93\xea\xbd\xf1\x99\xd3\x972\xf3\x15\x1e\x83\xdd\xe1\xabBS\xac$\xf0\xae\xf8z\xbad(f\x86\x94\x10;\xca\x81@N\xa2\xe1.(\xb4`\x87\xf0\x905\xa1\xc4\xdc\x0b\x89\xea,\x9cR\xbd\xf0\x10\xce\x00\x16\xc1,I\xefP\xcf\xfe\x01\x16dm\x9a\x13\x83\x84J\xa7\x7f,\x90\xda\x90\x81\xd3\x16^h1\xee\x0f\xb6\x80\xfa\x1e\xef7\x9c.M\xe1l*\\rb\xee\x97\xc5;rlq\xf5\xddv\xed\xed\x08{\xd9\xceh}\xe2\x1e\xd9\xe0\x12\x86\xa0%+\x19\n\x91\r{\x14Z\xc9\xfb\xe4\xb6#\xdf\x13\xc9*\xba5\x98\xcdJ\xfe!\x85\x90\xcbk\xe5Y\xe0\xf5\x83a\x05\x05!ru\x8b\xad\xc6gn\xa5Oa\xa5\x88\xfc#Q\xe6\x8bU\xcb\r\xb0\xbe\xd9\xb2\x91n=~\x1a\xd1\\\xf5\xc1\x8fH-x\x02\x1e\x00V1\xc2\xe0\xd6\x99\xcd\xd6\xc3{_vVk]\x19X\xc1E\xa0\x99T\x04&\x0b^\xe5\xee)R\x90\x1bF\x80\x0b\x9aypw\x85 \x8a\xde\x1f\x9b\x93)\xb1\xbcr7\x1e\xca\x81\x14\xe7\x08M\xa8\xac\x93\xd0!\xcb\x9f1T\xe2\x85]g\xc0xn\xe8\xa9\xc6>\xaa\xed\xf7\xaf\x10\xe2\xe3\xb0\xdew\x7f\xa0T}\x05G\xff9-\x95\xc6\x1aOh\xe6\xdf\r\xf8\xb8\xe4xe\x83O\x96g\x81\x93\x17\x0c\xf9\xda^0\xf9i\x0fb\xdb6\\[\xae\xf7R\rn\xff<\xe6\x92\xc2)gp,\xb2x\xec\x16\xe6\xdd&\xd1+\xad\x17\xa3x\x15t\xb2\xf9{y\x97{\xa7L\x8d\xc3%8\xde\xb5}\xf9\xb7\xd1\x19\t\xcc\x19\xa2qB\'\xd1\x9bL+\xa1\x9bG\xce\x84\x00\xd5\xbf\xda\xde\xcfv^\xbf\x8bYmbt\x92\x9c("\x9b\xc6h\xa4\xc4A&\xff(\xa2I<\x83\xee\xac=\xd8\x07\xa2\x07\xe2\x98\xc2\xc3\xf6\xda\xef$\xdfV\xd7\xd7\x81=\xc1\xeei\xf5\x10ev\xa7Sp\x82\xdb\x9dl\xa0\xdb$\x92\\s\xe1#\xdd\xc2B\x89\x81\xf8Vh\xd3\x13\x8a\xc4B\x06\x13\x8bk\x8e\xdc&\xba\xa7yI]\xbfQ3\xaf\xefR\t\xc1\xd3\xb6\xb1W6\x83TL\x16\xfb\n\x1a\xdf/\xe7\r\xb8\xf6H\x9a"<I\x9a-\xa63\xdb\xcd\x94\x9aW\xf8\xc2\x8e\x13\xe3\x85\x82\xc0 Hy\r%`AT\xf1\xeb\xd0\x84\x84\x8coR\xcbm\xf5\xedr\xc2\x05\xd4\xff\xd5\xbbU\xe4Q\xba\xd0\xf5\xf8\xfb\x84\x8a?\xa9\x02\xfd\x1b\xe9\xcd\x9e\xcde\xc1\xde\xddH\xf8\xc5\xeb-a\xbdFh\x93\xc9\x9cK\x91\xef\xdc\xc5\x8e\xdb\xd8\xa7\x9eW[\xc7s\xe0q"\x97\r\x88\x12\x7f2^\xb5\xb9\xf5?\xf9\x92f\x03i\xd2\x9f\xc6\xea\x80\xeb\xab\x8a\xce7\xc5\xfa\xa1\xad\x8a\xf7s!\x9c\xd6\xbey\xa5\xd3\x9eBEx\\\xf12\xba\xbaJ\xd9H\xbe\xacM\x84fH\xaf}\xe0\x97\x8e\x15a\xbf7/\x99(\x929\xf6\xab\xf3=\x15y\x03\x01\xdb\xa5Q\x1f\xd8\xff&\xd4e\x98\xf5\x86\xfdDu\xd5!\x10\xc9\xb4\xaez\xc2\xd8\x86\xaeU^\x82\x00:\xdc%\x0e!\xbc\xe4\x8d?\x14/\x9ek\xab{qky\xf1\x17\xf7\x16\x1b\xf0\xafMC\xa9^\x8cm\xbf\x18\x8e;\x16\xb0l\xe0J,E\xd68\x99\x92\xf2\xb8l\x8d\xbd\x7f\xea\xb47\xb51~\x1fZr\xb3|\xfa\xef\xa0z\xc3\x82\x1d}\xb1I\xb6O\x90\x9cm(\x8b`\x02\x97\xd4\xc9\x08K\xb0\x07\x8c}\r?\xed\x96\xa9y\x93\xc2\x82V\x14>V$\r3\xc9W[&\xebQ\xb2\xffn\x0f\xf1\xf0\xbbrS\xd1v{\xd5t\xb0\'l8\xac\xe4)s[\x06\x90\xc8\x9f\x94\xc1\xbf3\x08\x97\xb5\xbe\xcb=X\xa0,\xc9\xc7@f\x03\x13\x95\x8e:g\xb5G\xbe\x89}\xbb\x84\x856\xad\x03/L\x8a\xe8\x10\x83\xdc\x15O\xc8\xf0M\xc2pM\x00\xe4\xa2\xf0\xaa\x0b5V\r\xef:a"\xfa$a\x1f;\xc5\x8f\xca\xc72$\xbaJ\xf8e\xaa\xf7"\xf5\xb4f\x8d\x06n\xd3\xe8\x98\x8fa\x17\x1e\xe7\x10I\xd0\xca\x1a\xfb\x91\xd4\xb4\x98\xb2\x8eFw\x9a$\x90\x0f\x1c\x15\x9f\xde\x97\xff\xc3\x00\xc2{\xeb\xb5\xce8\xb4\xf5\x0b\xf3\xf7\xbd\xbf\xc3u\xc1\xfe\x8fc3\tu\x99t2\x0e\x8a\rBN\xd8TvMTR\x03\x8a\x000[\xc9\x04\x1eUC\x93&\xbb\xf9\x1d\xe94G\xdb\xb0S\xf9\xa0c\x85)\x97c\xe8\\1\x14\x96eJ\xfe\xa9\xc4\xdax\r/.\xa1\xae\xa2\n\x9b8C\x17\xd21\xa6Sa\x8f\x11a\x08\x0b\x86\xb0E8C\x02\xa2\x0bn\xcb\xdfgO\xb6\xc5\xd0\xe0\xb3\x82T0\xe2Y\xecS\x00\xb3\x17\xce\x16\xc3\x8e\x9cD\x10\xe2\x88r\x94\xa7c\xd9\xd7J\x1d\x92\x19<\xcb\xf9D!\xb8c\xa6\xbf0\xe7\x8b\x94\x9e\xb9nH\xa7\xca\x11\xbbe\xe9 
\xe0\xfb\x0e\xb6y\xf2.\x1e,\xe7j\x8b\xc8\x00\xc1|oS\xafn\xc6\xb1b\xde\x1aP4\x03TB\xf4\xd2\xf4\x9b\xea\xfc?\xe0\xda\xea\xf5A5\x10\xbf,D\xa2\xea`\x97\xe2\x13[\xfdJ\xaf\xce$\xad\xab.R\xbe\xe8}o\x9e`$\xba\xdb\xbd\x1dV\x83\xf9T%\xe5\x89\xec\xfc\xfe\x81\xf6\xef\x08\nN\xccb\xcdd\xd6\x80h\x81\x8e\xc4\xc9\xff~\xe1Sj\x89I\xbf\xcd\xf2_\x1a\xf0\t\xafT\xd1\xf7\xf4\xe5\x1c\x1a\x92\x7f\x8d\x06}\xc3\xfc\xfeN!cQW:P#\xd8A\xfa\xae\xb2\x8b\x8cUg\xd2\x1b8\xe4\x9b<\x1c\xd6\xe0\xe2\xd0\x95\xa3^\xd4Yw\xfa\x0e\xbb\xbe\xba\x03\xc8\xf1\r\x1c\x8e\xcb\x9c\xd7\x98}x\x074\x16\xeeY\xa0\xb54>G2;\x84\xe0\x08\xe5j\x7f\x02\x92\xfe\xda\xbaWco\xe1\xe6\x8d\xc9\xab\x06myyY\xb1i\x01_Q\x01\x89U\xf6\xc5;\x0e\x82w \x84\xc5\x86L\xaao{\x96SD\xfa\x00\xa19m:\xd8\xee\xee\xffjkQ\r\xb2\x88\x90\x8b\xc2^\xa4\xa5\xaa\xa7\xc9\xb6D\xe6\xc6@\xa8!\xa2\x88o\xc2y\xba\xc1\\#\x15\x99\xfa\xd2):\xe7b\xbe\xb4\xb2~*3\xb8 \nu.\xeaV\x16\x82\xabo\x97\xd9\x915\xbes4\x8b\xa8\xdc\xb1B\x91`\x93\x91~Uf\x86\x10Q`}\n\x81\xe8K\x82A\x80\\\xb7:\xd2\xb9\x00Rq[0\xd9{lG\x94\xc5o\xef\x8c\xdd\xb1\x17\x05\xee\xfa\xe6oY\x85\xf8\x88\xf8\xb5\x04x3Hc\x92}Q\xb8\n\xf2\x07\xad\xc7\xd2u\xcf\xf2M\xcb\xb2\xc2\x8f6 Jeudl\xea]Tp!Q\xb6;\xdb\xb7\xa8\x0b\xeb\xe1\x99%\xf8\xe2\xfc\x7fr\xb8]SY\xad5A\xd9D\xcd\x85j\x97 QI2\r5\x14d\x07H\xc5\xfcW!\x94l\x1fF\x98\x07\x80\xa9\x1dk\xeeV\xea\x87\xf6\x9f\x82gv\x08\t\x88\xa1(z\x0es*\xffV\xb2F-\x0c\xb4\x83\x9a\xdbf\x11\x1ek\x18\xbaguB\xef\x98\x01\x9c\x0e\xaf\xa9!i1\xcap\xa5\xaa\xca\x93\x9fX~\xf5\x97\xb7\xba\xa1\x0f\xe2ho\xf3\x83\xf2\xe0\xe3\x94Y\x10Z=\x13\xdef\xad\x97\xa9+\xca\x12e\x95\x0e\xe1\xc3~f\xf7i\x82;_\xed\xfa\xf6\xbb\x96\xf5\xe1\xf2_aR3\x19]\xacPj\xcc\xc4\x81V\xdf\xa1\xbf&!\xe8`\x88}\xbd\xd1\xfa\xe4kx\xac\xe3\xbf\xfe\x05l\x99r\x88$\xae\x9a\x16\x06\xad\xb4\x00\x87\xf3\xe7\xe4\xdc7\xfc\x1bLs\x16\x87r\xc0\x80\xa6\x96\x16Z\xb2Me4\xce\x16\x9c\xc9\xa4c\xbd\x97\xd9\x01\xc1\xf1F\x88Y\xb6\xab2L\x0f\xf8\x7f\x06\x8bT\xb6.\xd7\xe7\xf6l3\xe1\x18+\x94\xba\xfb\x0f=q$3\x89\x83eY\x93\x11\x10\x8dF\x96\x04\x96\xa7L\x04\x87M]\xe22\x08\xbd\x8d\x87\xed\xe7\xd8\x95\x15\x90\xe6\xe8y\'C\xb9\xe1\xaa\xd2\xa0\xa4DLA\x00\x8a\xee\xef5\x06z\x8f\x809\xc6#\xdf(\xf8#\xc3\xca@\xe0\n\x9bd\x06\x89\xd2\xd1D\xad\x0b\xd9\x9b\xadRYJ\x9e\x8e\xb0H5>c\xf4\xf5S\x94\x1d\xef+OQL[\xba9\xc2\x83\xe3\x0bN#\xe0aF\xf2Iz\xcc\x16\x93\\\x9d\xac\xfc\xe6e\xa0n3f\xce=\xca\xc9\x81\x80\xbf\x1a\x8cX6gc\xbf\xd1\xe1\x7f\xdbn\xd1\xe0\x9f\x84\x14\xbd*5^\xa7wEM\xdc\xf7A7\xbfz\n')
server.py ADDED
The diff for this file is too large to render. See raw diff
 
smoe.py ADDED
@@ -0,0 +1,3 @@
+ # Pyarmor 9.1.8 (trial), 000000, non-profits, 2025-09-12T14:31:07.823054
+ from pyarmor_runtime_000000 import __pyarmor__
+ __pyarmor__(__name__, __file__, b'PY000000\x00\x03\n\x00o\r\r\n\x80\x00\x01\x00\x08\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00wA\x00\x00\x12\t\x04\x00;a\xea\xb6\xeen\r\xe6\xfa\xe3_\xb9\x9a9U\xde\x00\x00\x00\x00\x00\x00\x00\x003\xf5\x87\xbds\x7fJ\x11\x8c\xa3\xff\xf1-\x17\xc1\xa0\xcb\x80\x83w"DT\xce\x90\x99B\xa9\x82\r\xc1IT$\x0b\xff\x14\xaa\xcb\x82\x1e\x99\\\x1f\xab_\xf2\xb8\xa2\xea\xd4\x8e\xb2\xdb\xa4\xa0L\x81\xdbs\x88\x92\xd8\xd0p\xfb\xd8\xaf\xf5\x8e,^F4\xb4\x86D$\x19\xec3\xf4/\xf3\xb4<#!}\xde\xb2\x8d\xe2\xed\xa8}\x1c\x96P\xaa\x1d\xd6\xe3p\x96V\x98\x05\x1f\xed\x19pGa\x94\x81\xac;K\xf9\xc2\r\x91\x04\x95=\x8d\xb6\xed\x98V\xea\xca\xda_d\x9fH\x9f-\x9e\xa9\xa6\xe7\xac\x01\xe0\xb7\xb7}\nD\x05\xaf\x85EhU\xe6HK\x84\xae\xb7N\'\xd1\xf9\xc4\x9f\xca\xd1x]\xc0.~\xf4\x91\xa2\xf1X\xfe\x99\xbe\xdc\xfbX\x82\xebk\t\x9b>}\x14\x91\x81\xdb\x19J\x01\xba9\xea\xcb\x81:\xc85\xacw\x89\x9c\xbd\xf8\x1dU\x17\xe1\x81\x15\x1e\xfaX\xed\x18\xe3\xebR\x1e6\x1d\'\xaf\xc6\x9e\xf6p\x86\xce\xda\xb8\x05\xbfd\x08`\xb2\xfb\\\x11\xeax\x9a\xa4(\xa4\x83\xb2Y\xff\xb1\xad\xc2\xf1\x14\xb8\xf0\xd5\x14\xeb\xee\xf5\xb9\x91\n\x04v\xae\x10\xd2\xbc\xe4\x83~\xf7I\x08\x1f\xeaJ\xc2\xc2\x06\xc0\xde\t-\xf2\xae\xcbTvA\x94"\xa7U\xfcL(f \x04@i\xe1\x80\xd78\x11gW)\x0c\xd6 \xa0\x1e]\xab\xde\xd8\xc6\x10:\x80\xa48\x1c\n\x8eW\x03\xa2\x8c|M\xe5E?\x1e\xbd\xf0\xfal\xf5\xbeH\x90:\x03\xd6\x1d\xde,\xe6\x94\x90$\nM\r\xae\x1f\xd6\x05\x85(\x01\x87y4\x153]Ak\xf9+s%\xa0\xb6\xdc\x82\xf3\x1a\x86\x1c\x01\xd8\xff\xb1\xd9\x00\xfe6\x87\xa8\x81v\xddv\xc7k^m\x8b\x8f#\xe6\xd3\xd1z\xa7 \x02\xe0\xe0\x02@\x9b\x8aX\xe1C\x82\x1e\xfe\xc1C\xb8\xd6\x08*E\xfe\x12/\xcd\x8c\xc7&\xa0\xe1V#\x88\xe7\xd6\xeb\x0e\x0657\xbc\xd8\xc8\xa4=P\xd6o\x940>\xd39\xd6\xf9\x1cD4\xd4\x15\xb6\x13\x19\x02D*?\xc0\x8f\xfc\x0eF-\x97\'\xed\n4\x83)\xb6y\xbf\xa6\x9e+\xa4[S;3c\x10d\xfc\x10\xc28\xb1\x14V\xb1\xf3\xd7\xe8\xcf`\xbe\x0c1\xf2\x15\xd6\xd22\xdbU\xd3\xbeb\xc7~\xcb\x1dE\x0f\xaf\x91\xe58\x0f\x1a\xeeD\xcdv\x8a\xb2\xab!\xb5\xa7x\x91Y@\xfb~\xf2\x816\xc3\xe1\xf4,\xbf\x12*\xde\xd3&\xe2\xcc:\x91\xe3\xc1\xdeE\x18\x91K\xce\x04\xfas\xa4j\x80_\xfbc\x13\xb6\x88\xc9\x83\x9c\x0e\xda\x82`.:\x13I\x00\xc7\xfe\xa9\x0c\x1e\xd9x["\xf6c\xb86\x12\xba\x9f\xe0Y\xd6\xfb\x89\x83I\x9b-"\x82V\x01.\xf1\xad-;8ce \x86\x18#!\xadS\xae\xbe\xf8\xf6Ip\xf3\x98|\xe3<\xf0\xe2h\x06\x8e\x03@\xa6+\x1e\x1b\xc10$\x02\xd7\xb2H\xc8\x0fE\x91x\xbeV\xf6x\xac~\x1b\xcb\x9d;\xbaM\xcd\x08\x9bo\rWc6\xdb\x98\xbf!,\xeb}\xe4\xca\xe8\xb3\xb1\x1f\x91\x95G:\xb3\x06O\x9d\xe15\xb3:D\xde\xf9\xca\xf8R\xcb\xe4*bS\xd3$\xfc\x95nJ\xe3t\x0c\xc1=\xd8\xd6H\xfa\xe0\xf5\x15\xe0ri\x1a\xaa6\x94,m\xe20\xb7a\x024\xdb\xe1\x08C\xe8\x11_\xb0\xb6\xc7\xd1\xfd:@\x1c\t\xc9\xf6\xdc\x7f\x05\xf7\x16\x96\x08\x1f\x9f?\x1f\x88-\x81\xd8\x85\xc3\xf9\x9c;\xb1\x0e\xe6\xc4v\x88\xe6wF\xcc\xd7\'7\xb2\xaek\'\x97\xd3\x85\xa8~`\xbe\xb4$\xab6>\xf8\xa6y\xfc\xd6eM\x16\xcb\x0e\x05\x10\x91"E\xc6`\xba\xe7\x98w\x86\xbc\xb1\x87\xa3 T\xcc\x91\xc8j\xcc\x7f\xfc\xf7\xf3\x00!\x04\xb4\x16Sy\xed\xa6X\xb1y\x1b\xa3\xf5\x16D{\x06\x01j\xa0\xfb?43\xd8\xc4<%j\xa0H3\xe2\x16^;\xc1\xbaF\x84\xba\x8d\x87\xc34\x08b\x8eA\xc5\xc1=\xe9\xe9\xecOG\xb5\x16\x05\x1a\xca\xbd\xae\xfe(\xdb\xb5:&&\xb4r\xc0=\x11^\xce\r\x8as\xe8\xf3\xb5\xae\xc8\xe6\xc4\x9cH5\x0c\x88\r4\x10ZWK`\n\xd8\xe7\xcc4\xb0\xfe\xb5\x12f \xc7u\xcc-!\xbd#\xe0\xe1\x1a\xa5\x9e\xebm\xc1j4J\xcfm\x98@\x9f\xb2\x80\x97t\x12Uq\'\x9d\xa0\x92k\x8c\xb2x^\xc6\rP\x82\xaa\x15w\x8f\x07\x81\xf4T\x94\xab\xff\x02\xe9\xcd\xec;\xfb\xe7N=\xf6\xd1\xa1\xd6A\xe6\x97\xfdm%\xfe\xe9\x9d\xef\x8bD\xf7\xa7\xe1 
\xc0y\xf4A+\xde\r,\x82\xbd\xd0U\x1c\xae\x1a\xfd\xf7\x1e-\x05\x8bXD\xe1\xeb\xf9B\x07U\x82\xe9b.c\x07dq\xe6{z*#\t\xe9a\xcbU\x98$%\xcd\xd9\x15b\xc9x\x97}\xfc.p\x14\xe5]\x18(m\xb4\xe8\x7f\xce\xde\xae\xb5)\x8a\xfel\xd2\x1a\xcd\xec\x00\xbacu\xd0\x14\xb30Aq8\x8a\xaf\x9a]\xaa\x1e\xff"\xbf\xf2\x07}|\xa3\xa7\xd6oq[\xb9i\x1fm\xf8\xe0\x85\x1cv\xec\x00\x02\xe2\x9f\xc6\xea\x1e\xac\xca\xe0\xc1\x8f\xd1wD\x83\xec\x95\x8d\xf1$\xa4\xa1N\xd0\xcc\xe4\x0baa\xc0\xa9\xed\xc5\x03\xcc9\xde\xd4N\xd1\xec\x82a\xc7\x90e\x1cz\x89{\x05\xef\xde>\x84\x0b\x10\xcc^V\xad\x00\xc5rl\x88\x0f&\x17\x80\xd55\xd3\x83\x9b\x07\x99n\x1a\xe3\xbe4\x9a\x17\xf0S\xe099\xc6G\xe4B\x00sTa)\x19\xdd\x82>C\xcf\x96\x08\x07\xf1B\xb3\xc0I\x9c\x16{\xef` &-5\xc1\x81\x17\xae\x92\xb5o\x10"\xfa\xe8i\x1e\xa3]I\xa1\x17\xf1\x85y\xe3!\xdc\xf5g\xa97\xd9i\xa7\x87\xb4\xe2\x15\xc2\x0e0\x9e\\|\xaf\xcc\x8fv\xe9<\xcb\xee\x0bY\xc7_\xf4\xbf\xcf\x1e\xdcJG\x8f\x15\xc07\xb4\xe7\xce\xe6BJ\xea;\xa3\x8c\xa3\x9b\x1b\x8d\'\xc6\xef\xc2\xc4\xc2\xd2\x10U \x8f8y\xe4K\xaa\xe2H\xfe\x99&a\xd9\xde8\xf3\'\x19\x0f\xf83\x0f\xd7\xb4-\xe0\x1a\xbf\x83]\xcc*4\xa1\xf9|\x97!T\xde\xb2.\x9b\x8a\xb2\xf0\x95\x03\xea\t\x80\xa6\xda_~\xd8)F\xb3\x90|\xa5\xb3c\xbe\x9eB\xb0\x8bI\xe03\x8d\x86\xd7\xf5\xbf\xc7\xd4\x83\x92\xdcj\x0bgOy\xcc\xba\xc4\x10\xac\xf43\xfa\xdd\xaf\xbc\x95\x85\xben\xdc\xbe\xf4\xb9c\xdc\x12Ff\xe7S0\xc2\x8b?d\x085~\x17\xb5Z\x16\xc3\xea\x938\x88\xc4\x84\xf0K\xf5O|\xbf\xcfn\x8c\xf9;\x8bD\r-\xea\x1f9\xe8\x0e\x9c\xba\xa8\xc7\xdeP\xd6\xa5p\xf7\x94\xe5c\x88\xdb\xa3\xb3\xcfC\x8f<a\x8bfTrb\x1c\xa5\x1b\xb0\x80`\xa9-\x81p\xf0\xcd#\x1eG\x87\xfd\x8b\xccT \xd3Y\xff\xf6\xd24;^\x9a\xab\xff\x95\xdc:8OZf\\\xd0;\xc6\xaf\xbd\xe3\xcd\x93\xfdxg4K\t\xb1\xd8By\xf0\xee\xb4\xcc:\x02,\xa8\xce\xd1\xedpbW\x1c[\xa3\xee\xd9\xd3\x8f\x84u\xac+&0\xa7\t$\xe5QBq$\xc3LR\xd5\xdf\xe0\xec4\x05\xaen\x7f\x17\xbe\xd7I\x07\xd0\x9b\x0e\xb7\x8f\xbb\xf8i\xd2\x9b\x0e\x8d\xb9\xb2\x96_\xa8A\xdc\xaf\x80\xb8aL\x88j \xe7\xaapJp\xb8w\xc4\x94"|\xadx\xa9[\xc8\xc7\x81\xc8\x07\x859\x8e\xe9\xdel\xb63\x81\x96\x11\x1d+.\xd9\x06<\x94\x1a\xeex~\nL\xf15\xafdO\xf4s\x11b\xc2>\xa4\xcc\xd8\x11\xd4\xde\xfd\x82\x00\x84\x88p\x85\xd81\xd6\x0b6w$\x12\xc0\x10\xcdkl$\xd8[\xa0!s\xc4\xc1\x1e7\x88e\x8c\xc9WCi\x00\xff\xa1*\xdb&\\#\xe9\xbb\xc6\xe577\xaa\xa9K\xcb\x81Z.\xa5\xaf\x16>\x83{\xc2\x86\xa3\xf1&\x0bq\x02Vj\x18\r^():\xb5\xa4d&\xc8\x1b\xaa(\x12\xb4q\xc8\xf9\x91\x1d\x8c\xb3-YK\x85\xabp\xd1]\xaa\x17|?\x8a}d+ \xd4\xb7\x8a6\x1f\xb4\xb8\x92\x9a\xf8?g\xa2F\x95\x82\xd0[\xcc\x9d\xfa\xf8!\x8d\x85\t\xf9Q\xd0\x9ch\xb0!_cU\x95R\xbd\x8e~\x0cew\xa01\xff\x97\xeb\x02\x8a\x8e_\xfb__qaeu,\xb0\x90\xa0\x85vK(HL\x08\xa42\x14\x07\x97^\t\x98\xbf4* 
\xc1p\xf4b\xd9~\xd3$\xfc\xc7A\x8a\\\r\xc6m\xf2j\xe6\\\x80QD\xfc\x01\x87\x95\xe9\xc4\xa7\xe3b+5\tB\x94T|s\xe5\x15\xd8]t\x852\xb6`o\xc0]\r\xfd\xde\xb7\x158\xf0\xc2L\x80Is\x8f\xed=\xc0\xcb\xc5\xb5\xd1\xa5\x8aQ\xef\xaf\xfc\xa0\xd7z\x7f\xb2]\x86\x8a\x85\x01\xd2\xdf~\x80}lw\xd2B\x01\xdd\x9f\x81\x97\xb3\x94\xf7yD8|\x9a\xa6\xd8\x97\x1dT\x91\xd0?)?\xb9]/\xcaY\xd4\x93\xe9\x83rD\x80\xf9\xb0}\xb0[\xcc?\xb7\xeb\\0\xd9H\xac_\xe5z\x9dW\xdf\xb2\xd0\xdf\x97\xd3\xc9\x8b\xaa1d\xb6\xd8\xb8\xcbO^$\x18\x0b\x1f\x10\x9e\x08\xe9!\xd6\xb4\x1cp\x9df\x8b\x8f\xf73\xf9\xcd\x01\xd4v\xb2\xe3D\xf1\xa7dv\\\x9e\xdas\xd34V\xc7\xf7\x15%\xef\xbe\xdb\x1c\x83:\xd4\x92K\xb0\xe5R\x84\xf26FP\xbd\x13L\xdf\x13n\t\xa0\xe7wo\xd7\xd4<C\xaai>f\x9eF\x10\x05r\xf5\xf1\x01\xf0\x83\xcb\'P\'\x90\x99"\xbdb+\x1f\xd6\xde\x12ZM\x806\xf6(\x817\xafB>\xb0Yq\t\xdd\xb3\xfd6\x9bD\xf0`\x96\xaf\xf2\xbc\xcd^\xda\x9b\xaaa\xc2VT\xcc\x12k\x97TI\xfb@O\xac\xd9\xebIBc\x99\xe8\xe2tLm\x83(\rF\xe2#\xc1)\x9acg\x12(C!\xb6\xe3\x0cb\xde\xf0\xa1\x05m\x83\'Z\xc2\xc9R\xfa\x8e\xf6\xc7I\xb8\xc6\x83Y\x00Qf\xa6x\x11\x8c2\xe8\xf0\xe3\xfb\x01\xc1\xb4$\x9a\xedu\x94\xd2\xb5a\x95\xed\xf9\xf1\xd3z1#W\xc1\x97\xd7u\xaa\xde\x0c\xc8k\x03lW\xf8\xe7\xd8\xbf\xb9\xca\t\xb9h\xe4\xf1$\x0fEY\x0fq\\>\xd6\xe70\xea\xba\xf23\xde\xa5xt\xea\x14\xd5\x87\x1e\x02\xd3\xe2\x81j\x1f\xfe\xe0/\xd2\x01\xb2\xcc\xa1\xb5\x11\xc5\xffJ\x8c\x02`I\xc25\t\x8c~\xccK\xf5F\xbeC\xb4\xdf\xfe\xcal448f\xb9-;\xda\x06?Y\xe2\x06\xbb\x1f\xaa\x9f\xe8e\xa7\xc1e\xf6v\xc5\xe3\xa6\x14Y\xb4xQ`+D\xa1+R\xf5\x95T\x13\xc2\x99^\xde\x02\xa76)\xb8U\x7f\'\x16\xf1 "%\x8d\x1f\xf7\x89\xd6\xfd\xc2i\xc4\xadf\x15\xbe<aK(\xfe\x7f\x81\x96S/\xdcf~\xc47\xb0\xf1\xfb\x12\x9b\xb8\xcf\xec\x18\x89\xb3!\x14\x1f\xa3\xa3\xe4\x9dD\x96\xae2\x84\x8e\x87@\x16\xf4E\x84\xf6\xe2i\x91\xdb\x80Y\x9f\x14o\x9e\xbe7\xa4;\xe4\x13:\x17o\xee.\xee\xa3\x1b.~\xf5\x84j\x1a6#\x91\xf2;_\x9dd"\xfe\xab\x99,9\xd3",\xed\x188[_b\x89z\xf3\x18\x8bg\xdcH\xfcE\x08>\xb9?t\x1e;\xbd\xa8\xfd\xe1{\xe9\xb9\x99\xac\xf2\x05\x92\xabp\xde \xbf\xa1~\x9fsH}\\d\xab\x8d2\x0ed\x8d\xf6\xf6G.I\xe9\xd3\xa7\xa6\'<0&k\x89\xe4\x90\xd5\xbe\xfc\x12\x83\xb0Xy;Q\xf0\x1a\xab\x9a\x13\x06\xb7\xa9X\x01\x85W2\x91$\xb1x\xe9\xa3\xac\x8e<\xe9\x9b\x02\xa3\x95,\xd3\xff\xcf\xcf\xd0l(?\x8d\x07]\x15\x01b\xbdJ\xb2T\xd8k \xe6\x0fT\x99\xd5\xa8\x15\x88\xaa\xb4\xa9z!\x10m7t\xd6\x12\x97\x02VJ\xd1\x91\x18\x91\xeaATaw\xc2\xedW\xe5\xab\x88\x86\xc8\xcf\xf8\x03\x8d\x1a8%\xd9+\x18\xed\xd2\x06,\x90\xc8\x0fd*xF\xf2\x08\xf5zX\xa0\x7f\x8c\xcaZ\xf5\xa6`\x03\x80#U\xefwj\xff?\xa6/\x13\x07(\xb1#\x90\x97[\xdb\xa1q\x02c\xc9\x12\x08\xe3\xaa\x98\x87\xb7\xcf\xe9\xbb\xd9c\xf6V\x83jY\xbe\xcf\x07\xdc\xa9\xa9\x04\'-\x08G\x1bn\x8f\x0b\x1f\x10P\xe5\xa2\xfe\xcd\xbd\x8f\x8a0A\xa3\x1c\xb3%\xf8n\xa2\xa2\xccA\x06\x8e\xa0e\x89\xe6\xb9\x0b\xb2\xd7\xd1A\x96\x8f\xd2\xa4-\x9f@P-\xd1\x9as\xff{=\x91\x9d\xddWK\x18a\x93\xdd\xe9\x04\xb0\x0b\x90!\xf8\xfa\x10\x009\xf4\xd9i/C/\xc0c^L\xf5\tV\xb2\x8a\xaag\x11l1\xc5U\xb8\x95\x12?\xb1\xa7\x84S\x03\x00.^\xaa\xf8\x16\x15KF\xf9(\x85Ep\xa5\xb78\x03\xb6\xac\xb0\x949\x0e\xf5C\'\xdeE\x9c0\x1c\x9f\xb1{\xac\xbd\x8c\xbe6\x1c\xef\xa1!\x1d\xad\x1cdk\xa5\x141\xf5\xc5\x80\xb5\xcf\xa2\x8b\xfes\x83AFt\xe0\xa4\x00\xcb\xf0?\x8c\xcf\x88\x81M\xb2\x19f\x96D[\xb3\x1eaKJIO\xe9\t\x94nl\xe8#f\xff\x91\xac\xe0v\x06$\x91\xb0\x8f\x7f;^l\xa0\x05_<D8\xdeG\xf1\x96\xf8M\xe9\x7fh\x94\xccG\xf0Om\x882u\xacH\xd5\xfe<t\xc7\xdb\x9a\x7f-\x93\x14\xc4\x1a\x8b_\x87\xec,\x04B\xf4\xee\xd3\x1dU\'\x92i\xdf\xceHH&I\x99]\xb4\xc6f\xb6\xd6"jv\'\xabOA\xcf`\xd16f\xcbd 
su^\xa6B\x84\x11\x82\x8a\xccI}\xd3\xc6\xea\xd2\x9fZ?\xb9\xf3\xeb\x8c\xed\x92\x8d2C\x87<\xd6\xcd\x7f\xc6$\x14\xcfan\x89xq\x9b\xdfx\xa1\xee\xa0~\xddy\xcd,\xd2\xbf\xc8\xdcs\xa8|\xf8\xb2P @,\xef\x0f}&\x89\xd4p\x88\xf5\xea\xdeRxl\xbcy\xbe\x84\n\x16}\xc2\xb2\xf4Q\x05\xae}\xdb\'ur\x00\xb9fds\xae\x83\xbe\x03&\x07\xc24\xcd\xae\xe6-\xd3v\xd1U\xf0\x96\xb3\r\x1c\xf2!\x9bt\x15\x95\xc3\x94\x9c\xf3\xfbr^B\x1e\xbe$\xa7,G\xfe*\xa9m\x18\x1f^Z\x08\x9e\x0bI[\'@\xb4\x1b\x90\xbe\x992~\x05l\xa7\x94\xd9v\x18\xf5\xe60\xf5\xe7\xe2\x8d\x90\x1b\xb8\x96\xd4x\xc2I$\xd9\xc7\x8ck,$\xee\x87\x7f\xbfZc^\xbc\xb9\x05\xfeVS\xd2\xb4\xee(\xf4}e\xe9\xbar9?\xf0+\x90\xca,\xad5<\x87U\xd8\x81\xf6\x00ZU\x08=\x85\xbbD;\x86v\xe5\xfe(D\xae\xf9\x83K\xb1\xddP\xba\xf6\xab\x01\x1623|\xdd\xa5\x12-\xf8^\xbf\xed{k\x80\xef\xf1L;\xfe\x81\x0bTy\x83V0\x99\xeb\xe2\x96\xeb\xf6\xc97e\xd9\x8f\xbe\xa3\x04\xc9U*/\x87-\x8c[\x83\x9e\x89\xab\xfe{Y\xack\xf7\xf68\xd9\x98\xd1!%\r\xfd\x91\xd8j\xc4:\x1b\xe9GRI:`/\xcb{\xae\xcf\xbf\xba\xe7-\xbd\xfc\xec\xf8\xf7\'\x9c\x10\x0e>`\xf5O1r\xf8\x8b\x1fs][mel\tN\xe5\x07V\xd5\xfb\xa1\x10\xfc\xa8\x8a\xd9y\x1d\xa1\x98\xb5\xe8(vk\xd4\xbc\xba\x02\xd7\xe4Ua\xb0}\\BM\xbe\x03\x94\xf6\xe9\x9f\xad\x8e\xd4\xaaH\xc9:\xaf\xf0\x88\x82\xa1\x99\xed\xda\xaa\x03\x00\x89\xc5\xbe\x1c\xd0\xdf\x08\x9d\xb1\xfa\x14\x9a\x1e^\x90R\xc6\x1a\'U\x12\x91\xd80\xde\xf16PII\x02o\xb7\xc6h\x12\\A\x95OY\x95c\xd3\x8f\x8c\xc4\x1f\xa7rn\xccz%jan\xa8\x17\x8a\x96\xc4 \xdeTr\xb8#\xea\n\x1fF\x88\xa4qG\x9a_hT\xda\xdd\xc1<\\\x14\x96gI\xb6\xfa\x92\xb7w\x9a\x06,oS\xb08\x86n\x1e\xbf)z\x881\xed\x90\x1e\x01v\xd3\xda+\xe7`&8\x00A\xff2+\xb3\x0c\xc2\x92\x0bx2\xf0\x16\xf1\x87\xafzkX\x87\xe0\xb1x\xdfQ{\xdcy\xc0x\x9b\xa0\xfa\xa0Z\x96\x0f\xa4K\xdbJ\x1f\x1e#\xf41=\xb2\xdf\x86tt\xeb\xd5K@\xd5\xee\xe5\x89>\n\x89\\o\xe4\x0c\x9e\xae8:\xffF\xc3)|\xce\xc5\x95q&2,*+\xd6\xe3dq\xc98<\x8b\xa3\x9fF\x858ha\x0cC7\x16\n=\xbb-L\xbb\xe8\xab\xa4[\xd8@<7yoBWn\xf5\xc1\xa8;\xd7\xdd,IW+H\xcc)O\xa7\xd1:$b\xe6`\x10\xffL\xcc\x8fR\xbev\xd6\x9d\x10\xd7\x1ec4\xc5\x1eOS\xe8\xef\x04x\xd8 \xe9\x83\rY\x9f{S\xad\n\x04\xc9\xa1\x99=g*\x0c\xe3|\x993U \\y\xb6\xb8O\xd0\x9f\x08\x14\x85\x0cK\xa8\xf4"\x89n\x19\xfe\x92\xf8 -\x97]\tZ\xfd\x9e\x04_\xc4\x8e\x81\xe9\xeb\x86D\xf8\xba\xb2\xbd~\xad\x8c\x81\xcc\xd8L\x8b\x918\xff\xfa\xbb\x8b\x1a\xefE\xd0\x1fd\xef\x96J\x86\x1b\xa0\xa1\xdc\\\xc6\xda5 
&\x97G4\xfe\x80\xa3\r!&\xe1\\.b\x05\xbd\xde\xdb}\\\xcd{\x04\n\xfa\xce\x10w\xbf\x12\x06\xda\xa1\x10\xea\xb0\xf1\x01\xbd8\xd05\x8fP5\x86\x1f\x006\x14\x99\xd3;\xb3TN\x82\x9c\xa7\xfe\xee\xb3\x92\x13\xcd\x9d\xea72\xb6\x9d\xfe0\x18\xa5\x9a\xaf\xdf\x1a"\x9f\xaf\x0eD\xee{\x87\xdd\x94\xb7\x87\x9f\x90Y\x02C\xaf\xfeS{1e&\xee\x97\x9e\x87\xeb\xb8\xe7\xca\xbd\xbd\x05\xf0\x94\xaa\xc56\x8f\x91\xeeY\xfay\x88}\xfeL%}\t\x17\xe3\x0f\xb2\x1ez;\xf4\x1a[lk\x06\x17\xcd\xb4w\xd3m\x11\x9c\x0e^\xa2\xb5\xc63j\xf9\x1b\xcay\xb5\xc9\x87\x1f\xd70\xe9\x11\xff\x8e\x88kD\xcfhP\xc4\xd8t\xca\xf9\x19\x97\x8a\x16Q\xc5\xc9~\xb5\xb3e\x93"\xefm#kD1A\xa5\x0f\xb6\xfc\xfb^z\xf3@\xae\x12\x9dF\xfe.\xb3I\xcf\x18\xbc\xae\x7f\x08\xe5\xc5OX\x99\xcd\xa1\x92\x8e\x81\x90\xd6\x96zA\x12!\xb5wI\xaaQ\xfcNUu\x88t\xb5#\x87Dq\td\xb7!C\xde$\x0c#+42W\xd3\xad\xf9\x9f\xe7\x8e8\xde\x1d\xac\xca}\xda\xa5\t]E\xc7z\xb6\xd7\xcb\xc9\x02W3\xa0D\xec\x96b\xfd\xd9q\x00\x11\xa2@zX_D\xee\x96v\x90\xe7\xdcW\nR|\xb9U\x9a$\xba\x9f\xaf\xeaK\xe02\x85\xd0\x9d\xdcQ\x0b1\x11\xa1mso\x03H\xeb\x06\x8d\xd3\xf6(\xe1\xf1/\xd9\xfeS\x02/(_DK\xfb|\t\xcc\x1c4g\x12\xe4\x06wR$A\x86U\xb4Z\xf6J\xda$E\x10@\xa9NT*\x8bG\xcf\x01\xea\xe8==%\xa9aw\'~\x82q7\x19\xab"}[vTN\xffN\x06\xe1\xb6\xb5T\xb8\x91\xcb>\x08h(\xcdf\xcdG\x9b\xec\x14\xcf\xea\xd5\x03-\x10Wp/\xc8\xa6\xe7\xc0\x01\x11\x00\x12^\x96\x86wL\xee\xc3\xc7\xc7\x03\xd6o\x1bEd\xff\xf1f\xa3d\x84#\xd6`\xc0\xdb\xa45\x8dG\x83\xbf\x03zV\xad1\x00\xe2\x1d*\xc6ykX\x12\xe5@E>\'8\xe2H\xb6\x1as\xfd K\x8d\xad\x0bMls\'\xb8\x03{D\xb6X\xa4\xa4s\x1f\x8f\xc9y\x99\xb1\xb6]Y\xddN\x9f*/\xf1\x8a\xdb\x97vx\xac\xc4d\xa1zi\xb4\xc1\xc2\xb0B\xc7L\x0fD\xcc8^}|N6|#\x8e\x11\xc0|\xf7-\x9e\xc1>\xd3\xb9\x0b\xb9\x1e\xe9\xd8\xee1s\x87h`\xb5\xd2R\xb9\xa7\x17s\xf7l>\nz\xf6qT_9\xd4g\xc2^2\xa9\xcf:\xf3E\xe9\xc1\xa0\x9e\xf2\xa2{\xe2e\xfdG\x08o\xed%\xfck\xd4\x10\xba\xabV0b\x10\x0e\xa4c\x08\x17<\xc1p\xe8\x1fz-\xfd\xc2\xaaax\x1f6\xb3F\xa2A\xed\xf0\x10\x8d\xb1\xae\x1cV\x82Q\x89%\x16\xde\xe7\x0f&\x07#\x88\xe8\xc2\x1b*l\xea\x95\xe6_\xb8\x1c\x92N\x04E\xaf\xdd\xb9\x1f`\xfb,"\xe2\x8d\xab:\xa5b\t0N\x92\xb5\x95Y\x98\xc3\xd9\xc8\x00~\xdaQ\xfb:\x10\xd5\xbf\n\xbc\xa8\x82\xb6\x88\xbe\xbfQ\xdf,\xfc\x15\xa0n\x1f\xe4\x94\xadk(\x92\xa0\xecs\x8aep.mX\xad\x02\xe6\x8e\x8d\xdb\xf1\xed\xbc\t\xc7\x1a\x90\x93G\x026\x17v>\x0c9\xc9r\xbe\t\x8d\x911\xaeXn\xb2z\x8b\xc8\xb8\x08N\x8d\xde\x90\xf6\xdfs\x0f\xd2k-\x18\x03\x18),\x93|F 
\xeb\xf1:\x03\xec\x82EE\xa7\x82\x9c\xc8\x99\x8f\xe4\xae)yo\xb7\x90&\xe7\xfa"\xe0\x1c\xb6\x994\x16x\x9d\x93\x19\x1e\x00\xfa\\\x01\xd7\xc3E_hK3\x1bu\xb0\x98g\xfb\x10\x17\x1fZ\x86\x0c3{o\xa8\x14\xd1r\x1a\x8a\xc2\x08\x0e!f@\x05\x9f\xd1\x95\\\xb7fZ\xc5\r\xea\x15\xe3\x05&\xe9\xeb\x05\x02`\xe9\xe0\xc8y\x8f\x9e]\x02,\x84\xb6Y:d\xbf\xe8_\xbcuv\xe4\xf0Il\x12\xfd\xd2\xc7\xa7\xb5\xe9\xc5\xef\xce\xf7\x8c6\x85J4\xca\x11sw\x17\xf6\xbc\x84\xd2i\x81\x87\x07\x11\x86\xffj\xe8\x93\x14\x87d\xf5\xf6\xf3a\xf8\xadY\xa0\x973kqZ\n\xddt\xa5\x9a\xd5\xb7_\x06\x0e\r\xa6#\x05\xf0\xa0\xf2BF\xa3!jy!K\xc7\x06\xb7\x1e\xba\xce\xbc\xd9\x8405\x8f7\xd4\x9b9\x1em\xabE\xe3\x9c,\x1a|\x1cZ\xcc\x18\x12Gv\xbbm\x8aP\xbf\x89\xfa\xe0\xe5)Kp\x0bP\xff!\x9b\\J\xfd\xd0s(\xca\xb0v6\xc4\xf2\xf6W\xe4\xcb\x88\xfe\xc2\x80\xb5TrO%\xed\xceY\x0eI\xca\xfc\x88\xaco\xdb1\xf8\xac\x83z\xa60\x94\xfdA\xcc\xd4\xa5\xa7ab\xd94M\x83\x066\xca\xff\xfe\x11\xef\x08\xf5/t\xcc\xcc.j\xa1\x94B$\x00\x07\x15\x07\x15\x93&{v\xaf3X,\xeb\xb0\xe3\xf2\xd5p\xc3\x07\t\xd8\xc8.\x97\x9c\xcb\x17S\xac\x0fM_G\x9f\xdd\xa2g)Fqa\xb6\xb8O6\xe3\\*\x13\x1f\x1ad\xc5i-\xcd|N\xd5{V1p\xec\xc4C\x01\\\xd8\'AH$\xfe\x86a\xb6{\xb7=^X%\xd6\xe5\xfe~\xd3\xc7\t\xa8\xb5\x8fG\x86.j\x8e\xe1\x91\xb4\xca\xae\x10\xdf2\xef\x11\x86\x83S-\r4 \xc6\xe5\xa0\x0c\x97\x14\xbc\xe4\n\x80b\x06\x8f\x18.\x92\xa5\xc9\xa64+_\xd7\x9e\xa0.\xccjr\x85\xf5\x97\xb5H\xc1\xa1\xcfDd\xc3\x06\x137\xe3\x1a(hH\x12\xd0\x08#\x1e9\xf9\xdec\x9c_\t^\xf8\xbf\xcb\xd9\xb6\xc18\xf0\xb9UI\x89\x87\'\xc2P_\xdf\xbf\x17\x8a\x1eu\x86\x8b\xb6\xc5R\xea^D^]\xb1LX\x17\xef)\xd0Q\xc4_W*6&\xde\xc5\x85\x85\x92\xa52\xdaj\xad\xd0\x1d\xf1\xb7\xa0\xe2\x83\xdb\x8a\x8eS\xee0\xd6\xb4\x04\xe4i\xb3\x14\xd8@n\xbc-\x87%\xbb\xb5p\xbf\xab\xbf\xb3{\xb1;\xfcsQ\xca\xab\xe9\x15\x07\xe9b/\xb4}\xc1\xd2lKV\x00my\xf9\\\xe3\xe9\x90#\x935)\x9c\x99\xaa|\xfePbr\xfd\xf4\xc5\x97\xa9$8\n\x1dRt\xd6\x1c\xda55[R?\x7fI\x92\xd7\xce\xcd\x84\xa92\xb9v\xeaZ\xfb\xe4paK\xe1\xe1\xd3\xe8\xf5\xc8yr^\xa2/\x118c\x0fT\xff<\x07[\xd5\xc1i\x91\xa6FT\xffxK8{s\xfcW\xd3p\xbe\x17\xf3\x1cS\xfe\xa4\x95C\x12\xa2\xce%\x00\xe4\x00\xe6Xh\xa4K0=\x10w\xcf\x1ew\xa2re\x84U\xf3]\x1b\xbd\x9eq/\xa1\x86\x0e>\x97\x1f\xe3\xc6\xba\xc9\xc6\r\xb8\xc3\x9c\xa6\xb3|\xdc\x15\x9c\r\xd6t\xf4iC\xe3\x93\'|\x85\x08\x91{^\xac\xc3\xcb5\xce\xa5\xa9L\x93\x04N}\x97Wi\xcc\x97U;\x00\x8d\'C?\xddVw}\xf0\xc1\xaa\xb0\xc3\xd0\xb1\xfc\x9db\x9e\xb0\xf5\xd5\x8d\xad\xed\xf8fuC\xa5\xa8\xd3\xbf\x18z\x8bdJAA\x88\xbcP">\xd9ll\xf2\x9di\xc6b\xe9\xda\xc4\x1c+\x91\x85\x7f\x97g7\x8d\xfc\x9d4\xf3\x84K\xa8L\x94\xba\x8a\xf3;.\xb0,\x8c\xe9\xd5\xfa\xc9k\xc9\xd5\x98B7+\xe2\xc9\x14\xb2A\x0475\x1ejya\x91z\x106\x06.\x14\x12(E%!\xf7\x9d\xba|\x92\x9b[\xa9\xe9\xecgzlcO\xee\xd0\xe7\xb4\xdf\x96\xd1pm\x11\xee\x93\xb1uz\xb1X\xd4~\x18%\xf2N\x93)\xb7\xe5\xa1\xa2\xd7\x0c\xcd\xde*\xef\x89\xf0\xfbzoz\x9a\x000\x94f\x17\xdb\\\xfb\x90\x19\x8c\x98\xbfL^\x1d%\xdf\x16L\xe3+\xacl\x98\x907\xdd\x94,u\x80\xe3&\x1dND@q\xc1G\xd5\xcf\xbc&\xe8/\xc1tB\x07\xe6Z\xcc\x8d\xfeu5!\xa6\x14\x91JmuW\x96\x16\x05\x7f\xd3I\xe0\x88\xb6c"o\xe7M}-\xae>\x9a\xce\xa5\x19\xcd=~\xffdr1\x7f\x86\xa3M;4\xa2\x89\x15\x0fa\x96\xecS\xe4).\xe0P\x0b\xe6\xb5\x8c.\x907\xb5\x91\x0e\xfd\xf0\x0c\x90\xec\x87\x9b\xe7\xa8\x0f\x95\xbbYe|Z\x1d\xf6\xa6\xf9\x9d\xeb\xffEm\xe7\xcf\xbb3eC\xdeXm\xacO\x93X\x05,,\xc7\xd6\x02\xdf<\x04\xe5F{A.\x00\x15\xd6\xa8\x92\x89\xba\xc6\t\x1b\xde\xeb4vI,aC\xf8n6\xf0"&|F^\r\xaf\x0b\x03\xef\xd1\xae{w\xc1L\x81\xb7\x14\x15E\xaf\xa8g0\xe9,C\xa0\x97\xf1uT=Q\xc4\xc3\xb2\xc1\xe8\xccEx\xa6\xe3c\xcb\xd4!]\xde\x8cE\xd5\x8d\x0e\x93,\x18\x14\xca)\xbe\xd2\xc9\xcd\xbd\xa4\xbb\xb3\xab?\xf9\x14>&J\xe8O\x9f\xa8\xf5\x89\x95
:\x8b^)\nd\xb8c\xdf9\xaf\xe8\xf628\x06@f\xcb\xde\xe1\xb3B\xd6\x1e$,`\x1f\x84.\x974\xd0\xea!\xd3\xbe\x91\x117#\xc2;m\x8c\xe6\xb4\xff\xe7\xb8\xd9\x82\x97\xda \xaf*R\x94OG\xcb\xc2(1\x94{}\xf4\xf0b=\xcbm\x85\x83\xe5\xd2\xdf\xaex\xef\xeb\x05\xeehX\x87a\x04\\\xba\xf3/\xe3w\xa6\xabz\x9d\x87\xe0\xc5z/\x915\xb5}q8,p\xe7\xb1\xd1\xd0\x11\xe1\xc2\xc25E\xed\xa0\xec\x9b\xcd\xf3/\x85\xfaK^\xc8gQ\xc3YC\x9f{.\xb1\'c\xb0\xcc\xa9\xbd\xbc\xc2\xe7y\x1f\x86N=\xc4<\x0cK\xef\xe0\xcbz\xe7@\xbaw\x0f\xa4\x01\xe6\x94\xed\xb0\x96\xdbc)\xce$8<\xe7\xe9-V\x15?\xe9s\xea^91r\xc6k\x15e\x03\x843\xe0z\x1e\x8f\xd9\xc7\x06v\xa02\xc9@\xe4\xb0\xb89\xa0\x7f\xb0\x8a\xc4:\xfc\x1a\xce\xdd\x18_|\x89\xb4E\xe7\xc8\n\xd9^\r\xdc\xda\xb5\xc9^\x99E\xb2\x1f\x83\xa0J\x9f\xb27\x10\x99\x16\xb5(\x88k\xf2\xf0q\x86\xd45\x91\xc6\x96\xa9+\xea\xea\x03]\xf7A_j8_\x14\xcc\xb3\x8bf\x89iF]\xffD\x8e\x91\xa4\xa0\xb9\x9a\xca\x98\x94\x1c\xfd%q\xa4\x1fG\x07\x0bv\xbf|#H\x16\xcfV#\x9dP\xa5)m\xde:J\x07J+\xf8\xb94\x0c#-\x99\xba\xa7\xafa\xef\x14,G\xe9?\xb3sh\x116Q\xc9\x14\xde\xcf\x88w\x0fd\xf3\x1c3\x86\xe5/f\x9f\x90\xd4\x08&\x8aL\x1b\x1e\x14\x03e\xdbA\xa7\x9d\xb3m_3AzH\xe1\xde\xa8\x82\xa2\r\xee\xde\xae\x82\xa8\xa7\xd9H%V\x14\xb9\x08\x083\x0f\xf6\x83\x1b{\x82\xd5u4\r63h\xe3(i"\x93D+0\xccf\xcdY \x99\xa0\xbb\xfbF\xc0yAz\xb2\xa0\xc0\x93\xb4\x89\x8e\xe2\x06<@l\x94\x14\xe5\xea\xde\xd2\xa9\x95\xf2O-U\xdepH\xdf\xa4\xa1\xad\'#^\x8d\x97\xc1K\xddT\'\xae\x95*5\xa5L\x1b.c\xb3\xc1`b\x1c\xfe\x92i\x9d"\x8a\t:3c\x88\xa7A\xddE\r\x85N}a\x84\xc6\x80\x958\xcf\xacLbY\xee\xa9?\xc4\xee\x0bB\xedx6\xae\xb3\x13\x1fjH\xc4R\xb4\xc8\xe4\xb8\xa8k5\xdbV\x80 \xb0\xfd1\x92\x9e\xdb\x06%g\xdb>I\xfe+K`\x11\x1a\x12f\x83\x1e\xd1\x00E\xe9\xa6\x08s\xce\x10\x11\x01\xbe\x93]\rOH\xff\x88A\xc5O\x86*\x8b\xaf?i\x88jaP\xf2?\x00~[X\xa2\x82\xd0p\x97\xbc\xa4l\xc6R\x9a\xb2\xaas\xdf\xa3\xd2.n\xcb\x8e\x97\x15\x98\xed\xb8\xa1\xc3\xa0H\x17C\x87}v[WV\xb4\xf5*\x80\xcb!@\xec\x01\x14\xa1\xd2\x99!eA\xcao@\xdb\x8f\x13\xd3\xd6\x92e\x90)|\x90\xb2f\x9e\x82\x84\xc1\x8d\xd7L\xbclT\xe0Q83\xf6X\x01\xb3\xe3\xa5#\xfe\x93K2<\xa0\t\xda"x\xa6\xbb\xbf,(M\x91Q\x93\xc6!\xe0\x9f\x19\xa2G<\xef\x90tS\x1b\x9a;\xef0L \xcf\x05\xcadEr+\xb9|8\x93"\xdb\xbfl\xf4\xb2Gw\xe0\xac\x9bl:n\x81\tn\xd7\xc6i\xb2\xf9\x88P\xdf\xd8\x17\x08\xe0\x9a\xedR%/\xb7\xd8\xc4\x01BH\x83\x0e\xd0\x1d93\x98\xf1\x8e\x92a\tJw\xf0N|\xf5\x82\x05\xe4\xc3p\xe4\x1b\x01jHP_N\xa5.\xb2\xbc\x1bR|\xd4\xe3yn~\xe3A \xe1X\'\xe0L\xfa\xf0H\x99\x833\xcb@\xa4\xbf\xf5E6\x1e\xc1\x80\x16v\x1a\xfa\xfd\xc8E\xa7,N\xc1\xf7<4\x82>E\xbc\x05\xc3b\x98\xef%\xe92\x84\x87\x90\xfa\xabM\xf8\xf5.\xb7*Hw\xc86\xffr8\x08z\xda[\xfa\x8fF\xda>\x1a=\xd3\xdc\x8b\x13\x8a\x1ed\xcfB\xd9\xa2\xec\xed\xf5_\x9c\xfa&\x97i\xce\xdc\xa6\xd6\xb6\xb5\xcdjJ\x983\x93\x06_N\xbd\x92\xb1\x9fs\n\xeb?\x9c\x17\x9ah(g\x05\xc0\x17-\xf0O\x8b)\xec\x04\t\xd0\xf5\x92\n\x89\xd3\xe6JN\x9c\xae&\xcc\xb4\xfa\xf1\xd5\xa2\x10G\xd9>\x89\x17\xd0^\xb6\x1d\x06\x88\x7f\x08]\x0e\x81\xf9q/T\x1c\xb9\xb4\xee\x19\xe9\xc1\xb9w\xb3N B9\x1f\xa2\x05\x9bOB\xc1\xbe\x0fu\x15\xc2\x04n)yl\x89\xaag\xc0\xc1 
spk_001.wav ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:79de3a5775f8880c0bf3e950b103f03b257db630224fab265a309d82753b1aa5
+ size 480044
test.ipynb ADDED
@@ -0,0 +1,190 @@
+ {
+ "cells": [
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "/home/salman/salman/minomni_sn21/omega-v2v/console/backend/venv/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n",
+ " from .autonotebook import tqdm as notebook_tqdm\n",
+ "/home/salman/salman/minomni_sn21/omega-v2v/console/backend/venv/lib/python3.10/site-packages/torch/nn/utils/weight_norm.py:143: FutureWarning: `torch.nn.utils.weight_norm` is deprecated in favor of `torch.nn.utils.parametrizations.weight_norm`.\n",
+ " WeightNorm.apply(module, name, dim)\n"
+ ]
+ }
+ ],
+ "source": [
+ "from server import lm"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from server import tok"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import torch"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "\u001b[32m2025-07-17 20:59:03.022\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36moutetts.models.hf_model\u001b[0m:\u001b[36m__init__\u001b[0m:\u001b[36m20\u001b[0m - \u001b[1m🔄 Using patched RepetitionPenaltyLogitsProcessor -> RepetitionPenaltyLogitsProcessorPatch | penalty_last_n: 64\u001b[0m\n"
+ ]
+ }
+ ],
+ "source": [
+ "\n",
+ "rr = \"\"\"I'm trying to come up with a funny name for my new goldfish. He's orange with a white spot on his head and he's pretty energetic. Got any silly suggestions?\"\"\"\n",
+ "\n",
+ "inputs = tok(rr, return_tensors=\"pt\").to(lm.device)\n",
+ "\n",
+ "with torch.inference_mode():\n",
+ " out_ids = lm.generate(\n",
+ " **inputs,\n",
+ " max_new_tokens=500,\n",
+ " do_sample=True,\n",
+ " temperature=0.2,\n",
+ " repetition_penalty=1.11,\n",
+ " top_k=100,\n",
+ " top_p=0.95,\n",
+ " )\n",
+ "\n",
+ "resp = tok.decode(\n",
+ " out_ids[0][inputs.input_ids.shape[-1] :], skip_special_tokens=True\n",
+ " )"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "\" I've got a few, but they aren't very catchy. The one I like the best is just gonna be called fish. It's kinda long and it's kinda boring. Oh, I thought you were gonna give me some name for the goldfish. I'm just kidding. Yeah. So, you know, it's really easy to take care of a goldfish. We have a big tank, and, we're both in the same house. So it's not like, oh, where are my three goldfish? You know, it's just, oh, how many goldfish do you have? It's, like, four or five. But, we only have room for one person to be a goldfish keeper. So that is hard, especially when it's, like, 20 degrees outside and you're trying to keep a fish at home. Right? Yeah. That's difficult. And with the tank being this size, you don't really feel bad about taking him out. You know, you just kinda get a little more nervous because you know you're gonna be doing a big fish transfer if you have that big of a tank and all that stuff. But Mhmm. It's much easier to take care of the goldfish at home. So I wouldFor the rest of us simple folks, we worry about somebody stealing our password. To you, you laugh about it because you know how to do that with your eyes closed, right, with the technology you've created. So nowadays, you talk to certain investors, so where do hide your passwords? I don't want to really say, but I hide my passwords in my notes section on my phone. Oh shoot. Okay. Where do you hide your passwords? I write it on a piece of paper. Where do you hide your password? I have it on file on my computer. Where do you hide your password? I have it on an Excel spreadsheet, right? And all these places you go through. And so now there's a business model for apps that you put your passwords in and they protect your password. If it's so easy to break into softwares to get my password, How can I trust an app to restore all my password? Is there anywhere you trust to restore your passwords? So let's imagine that I want your password. I'm gonna make a website for Iranian American fans of Atlas Shrugged, and I'm gonna send you an email with a,\""
+ ]
+ },
+ "execution_count": 5,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "resp"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "'All right. Good afternoon, everybody. Welcome to Friday afternoon. Appreciate you all coming. Really pleased today to be able to host the students to to COVID. Great. Correct me if I get it wrong. From the University of Wisconsin,'"
+ ]
+ },
+ "execution_count": 8,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "resp"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [
+ {
+ "ename": "ValueError",
+ "evalue": "Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating",
+ "output_type": "error",
+ "traceback": [
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+ "\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)",
+ "Cell \u001b[0;32mIn[6], line 5\u001b[0m\n\u001b[1;32m 1\u001b[0m messages \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 2\u001b[0m {\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrole\u001b[39m\u001b[38;5;124m\"\u001b[39m: \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124msystem\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m: \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mYou are a concise assistant that answers in short paragraphs.\u001b[39m\u001b[38;5;124m\"\u001b[39m},\n\u001b[1;32m 3\u001b[0m {\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrole\u001b[39m\u001b[38;5;124m\"\u001b[39m: \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124muser\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m\"\u001b[39m: \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mExplain rotary positional embeddings briefly.\u001b[39m\u001b[38;5;124m\"\u001b[39m},\n\u001b[1;32m 4\u001b[0m ]\n\u001b[0;32m----> 5\u001b[0m prompt_ids \u001b[38;5;241m=\u001b[39m \u001b[43mtok\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mapply_chat_template\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 6\u001b[0m \u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 7\u001b[0m \u001b[43m \u001b[49m\u001b[43madd_generation_prompt\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;66;43;03m# appends the assistant header the model should complete\u001b[39;49;00m\n\u001b[1;32m 8\u001b[0m \u001b[43m \u001b[49m\u001b[43mreturn_tensors\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mpt\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\n\u001b[1;32m 9\u001b[0m \u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mto(lm\u001b[38;5;241m.\u001b[39mdevice)\n",
+ "File \u001b[0;32m~/salman/minomni_sn21/omega-v2v/console/backend/venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1621\u001b[0m, in \u001b[0;36mPreTrainedTokenizerBase.apply_chat_template\u001b[0;34m(self, conversation, tools, documents, chat_template, add_generation_prompt, continue_final_message, tokenize, padding, truncation, max_length, return_tensors, return_dict, return_assistant_tokens_mask, tokenizer_kwargs, **kwargs)\u001b[0m\n\u001b[1;32m 1618\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m tokenizer_kwargs \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[1;32m 1619\u001b[0m tokenizer_kwargs \u001b[38;5;241m=\u001b[39m {}\n\u001b[0;32m-> 1621\u001b[0m chat_template \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget_chat_template\u001b[49m\u001b[43m(\u001b[49m\u001b[43mchat_template\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mtools\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1623\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m return_assistant_tokens_mask \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m re\u001b[38;5;241m.\u001b[39msearch(\u001b[38;5;124mr\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124m{\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124m%\u001b[39m\u001b[38;5;124m-?\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124ms*generation\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124ms*-?\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124m%\u001b[39m\u001b[38;5;124m\\\u001b[39m\u001b[38;5;124m}\u001b[39m\u001b[38;5;124m\"\u001b[39m, chat_template):\n\u001b[1;32m 1624\u001b[0m logger\u001b[38;5;241m.\u001b[39mwarning_once(\n\u001b[1;32m 1625\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mreturn_assistant_tokens_mask==True but chat template does not contain `\u001b[39m\u001b[38;5;124m{\u001b[39m\u001b[38;5;132;01m% g\u001b[39;00m\u001b[38;5;124meneration \u001b[39m\u001b[38;5;124m%\u001b[39m\u001b[38;5;124m}` keyword.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1626\u001b[0m )\n",
+ "File \u001b[0;32m~/salman/minomni_sn21/omega-v2v/console/backend/venv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1789\u001b[0m, in \u001b[0;36mPreTrainedTokenizerBase.get_chat_template\u001b[0;34m(self, chat_template, tools)\u001b[0m\n\u001b[1;32m 1787\u001b[0m chat_template \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mchat_template\n\u001b[1;32m 1788\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m-> 1789\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 1790\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mCannot use chat template functions because tokenizer.chat_template is not set and no template \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1791\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124margument was passed! For information about writing templates and setting the \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1792\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtokenizer.chat_template attribute, please see the documentation at \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1793\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mhttps://huggingface.co/docs/transformers/main/en/chat_templating\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1794\u001b[0m )\n\u001b[1;32m 1796\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m chat_template\n",
+ "\u001b[0;31mValueError\u001b[0m: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating"
+ ]
+ }
+ ],
+ "source": [
+ "messages = [\n",
+ " {\"role\": \"system\", \"content\": \"You are a concise assistant that answers in short paragraphs.\"},\n",
+ " {\"role\": \"user\", \"content\": \"Explain rotary positional embeddings briefly.\"},\n",
+ "]\n",
+ "prompt_ids = tok.apply_chat_template(\n",
+ " messages,\n",
+ " add_generation_prompt=True, # appends the assistant header the model should complete\n",
+ " return_tensors=\"pt\"\n",
+ ").to(lm.device)\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "venv",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.10.17"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+ }
test_asr.py ADDED
@@ -0,0 +1,23 @@
+ from server import gt
+ import librosa
+ ref_audio, _ = librosa.load('/home/salman/salman/minomni_sn21/omega-v2v/miner_models/MiniCPM-o/assets/input_examples/assistant_female_voice.wav', sr=16000, mono=True)  # load the reference audio
+
+ text = gt(ref_audio, 16_000)
+ print(text)
+
+ # write a code to recursively iterate a directory and subdirectories to transcript all audio .wav files in it
+ import os
+ def transcribe_directory():
+     for root, dirs, files in os.walk('/home/salman/salman/minomni_sn21/omega-v2v/miner_models/recordings'):
+         for file in files:
+             if file.endswith('.wav'):
+                 print(f"Processing file: {file}")
+                 file_path = os.path.join(root, file)
+                 audio, sr = librosa.load(file_path, sr=16000, mono=True)
+                 transcription = gt(audio, sr)
+                 print(f"Transcription for {file_path}: {transcription}")
+                 with open(file_path.replace('.wav', '.txt'), 'w') as f:
+                     f.write(transcription)
+
+
+ transcribe_directory()
utils.py ADDED
@@ -0,0 +1,7 @@
+ api_key = "claude-rwjrljsdjfhsjvinesfsdgqrqw"
+ temp_ = "omega-omega-omega"
+ netuid = 21
+ competition = 'v3'
+
+
+ hotkey = "5HQQxrp3EDBi1M7pCG2pyJuLiqCyHsog8HssnhekfkbKKfZ6"