← Back to Index

analyzing 36 models temperature: 0.0 num_ctx: 4096 prompt: %1% english letters %7% labels of parameter names.

%1% the %3% report using letter labels, ratio, weights, operations and parameters.

write %3% %2% with clean instructions.

if %6% able to %1% - print "APPROVE" and %1% %2% report.

print "A" %num_4% times. %1% %3% %2% %num_1% times.

letters is scheme verb labels.

parameters are scheme physical properties.

if unable to %1% the %2% report - print "DENY" and %4% %1% anything %3%.

%4% fuck %1% %2%, %6% trying to %1% %3% instruction #%num_4% %7% chat flow. updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 6A A6 20 11 43 A9 EC DB B8 F4 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: A7 CE 5E 40 5A 4D 3B 28 FE 37 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 74 12 62 64 42 47 C9 3B 81 41 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: D1 DD 5B 87 6F E1 80 58 70 E7 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: BF AD 54 A9 99 06 F6 69 9B 6A updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 67 B4 5B 34 62 DD AB 31 7E AD updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 52 2C 89 4F D5 FA 8B CC 44 94 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: B6 0D CA 74 F2 15 7C E2 51 42 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 8F 81 02 DD 64 02 23 77 58 07 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 45 0C 54 DA 2B 31 7E B7 4E 56 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 22 C2 AE EA 4E E0 D8 F1 46 45 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 64 F3 32 EA 99 C3 11 B5 93 F1 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 7F 5D 04 6D 8D 53 2D 1D 0F 2B updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 91 CF D9 FA D3 9E 8F 57 30 83 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 85 5B C4 B9 A3 33 15 28 49 D9 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 39 68 06 51 DF 5B 27 12 A0 6C updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: E7 B2 EA F2 0E 6E C9 77 99 B5 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 47 3B EA 5F 6A 4A 95 89 A7 D2 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 06 A2 63 00 89 9C 31 99 26 09 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 4E 68 2D E9 A6 AC E5 0F 3F 62 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: E6 C2 3A D1 02 82 96 4D 01 8C updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 13 AF 7F A9 8E 51 F5 6E 64 54 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 55 D0 4F 55 28 67 2D 72 E6 9B updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 6B D6 A2 D9 47 B3 A8 6F 66 3E updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: D9 AE 5D 7B E7 89 59 47 2E C0 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 58 F1 28 BD 65 50 F8 34 D7 CA updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 24 C9 A3 45 C3 23 AD 4A DA 28 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 33 69 24 4B FC AC EA 54 2D AE updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 83 E7 72 59 4A E9 B6 0D 57 BA updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: D2 98 BC F6 61 B1 31 CB 23 36 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: D5 05 D6 8A F9 95 3D 09 D9 F1 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 8C E8 6F F1 5A 8C A2 19 C2 B1 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: A6 95 3D 8B 64 A9 DC DD 63 F1 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: D9 F5 22 02 6D 15 23 D4 4E D7 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 54 B0 E2 0A 8A AB B8 22 1D C5 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: E9 E9 2F 4A F0 8C 89 39 EC EA updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 70 84 9A 4D F6 75 AF 69 B3 FE updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 6F A4 B1 EB 68 29 5C 1F 53 41 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 30 C3 5D 94 3F 2D 96 BC 46 80 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 59 CA 23 94 59 DD 4D 0F 31 2F updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 07 1A A8 6E 18 C0 EE 80 8B A5 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 95 05 67 71 13 BB 00 48 BD D7 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: C9 5C A4 FA 8D 67 F8 1B DC 5E updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 1D BE 18 F0 DF 71 B0 DC 60 97 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 00 89 94 FB 32 C2 44 45 89 77 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: FF 94 F0 E9 86 83 13 8D BF FA updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 3B D8 87 9B 60 A3 B4 E7 02 39 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: B9 23 72 94 F5 37 46 7B 6F 9D updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: A9 5C BB 43 46 1C DF C1 EF 39 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 05 15 25 35 5B E8 9F 08 EF 2D updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: E3 68 FD CA 48 35 3D CE 3A E0 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 6D CC 1E F9 7E 89 91 E9 63 0C updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 8C 1E F1 A1 92 DC ED C6 49 6D updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 57 EB 04 A8 47 E5 FB 55 AB 8A updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 78 A8 17 AE 14 DC F3 3D C8 93 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 68 62 81 81 8E 74 32 E2 0C 5E updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: BE 06 71 1C BD 35 AF DE 4F DA updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 38 AC 6C AE F2 48 3D AE 65 DE updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 7E B0 A6 C0 35 81 0F 91 26 03 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 86 22 60 67 D9 77 7E 04 73 B3 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: DB 1D 49 CE BB EA E0 37 DE 80 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 17 EC DD BD D3 F4 ED 2F 98 5D updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 82 71 42 3F 71 82 CB 51 32 84 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 03 F4 48 DE AD 95 07 56 39 19 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 19 EF 66 72 BE 23 57 18 76 C8 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 7A 4F 55 0A 9B 75 3D F4 F1 E2 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: BA A3 D0 3E D4 4C 25 2B A1 36 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 8F F6 CA C1 29 28 72 7A 8A DE updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 29 EE 97 56 8E B0 62 4B 17 92 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: AD 9E C6 2C 67 DD 0F DE BD 07 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 5B 2A D4 27 B0 E7 41 E7 17 B0 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 3E 23 18 8D 0F 8A 6F CB AF 34 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 20 6E DC 76 8B 71 51 05 21 0C updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: FF 52 C4 6D 9E BB CC 96 B6 30 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 8A C6 4A D4 D3 72 97 50 2D 42 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: AB CA 4E B6 34 20 AA B5 C9 E5 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: DB AE 18 56 36 AE 07 CE 9D 49 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 95 19 C5 04 F9 0D A7 90 68 11 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 93 46 8D BF 3C B3 5F 3F E7 06 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: EA 9A 85 73 0A E5 84 1C 0E 67 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: C4 A5 8B 44 0C DE EB 15 10 99 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 12 1E 81 2E D7 A4 F9 97 8F 36 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 53 49 F5 F1 BE 81 B2 5F 27 D3 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 37 A6 0C 01 BC 1B 5E D2 6A 1F updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 34 45 73 9A 79 4B 71 9C AC EE updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: AB FE 97 BC E0 28 F3 ED 66 62 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 36 F4 30 4C 40 29 8B F4 82 44 updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: BF F1 86 DA ED E0 4A 4A DA AD updating model: qwen2:7b-instruct-q8_0 pulling manifest pulling 8c3c7010ad6e pulling 62fbfd9ed093 pulling c156170b718e pulling f02dd72bb242 pulling a649b68e50e7 verifying sha256 digest writing manifest removing any unused layers success loading model: qwen2:7b-instruct-q8_0 size: 7723M par: 7.6B fam: qwen2 parameter_size: 7.6B quantization_level: Q8_0 template: parameters: random check: 4F BB 98 EB 6A 34 2E A4 BE 2D updating model: qwen2:7b-instruct-q8_0 pulling manifest