βœ… Logged in as AI Bot
[2025-09-19 13:43:37] [INFO] [bot:52] πŸ” Loaded MODEL_NAME from .env: gemma3:12b
[2025-09-19 13:43:37] [INFO] [bot:54] 🧹 Attempting to clear VRAM before loading gemma3:12b...
[2025-09-19 13:43:37] [INFO] [ai:88] 🧹 Sending safe unload request for `gemma3:12b`
[2025-09-19 13:43:37] [INFO] [ai:95] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-19T17:43:37.979562083Z","response":"","done":true,"done_reason":"unload"}
[2025-09-19 13:43:37] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:43:38] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:43:38] [INFO] [bot:58] πŸš€ Model `gemma3:12b` preloaded on startup.
[2025-09-19 13:43:38] [INFO] [bot:61] βœ… Final model in use: gemma3:12b
[2025-09-19 13:43:40] [INFO] [bot:153] Logged in as AI Bot
[2025-09-19 13:43:40] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config.
[2025-09-19 13:44:15] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:44:16] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:44:16] [INFO] [ai:100] llm-ca1f6d8a LLM request start model=gemma3:12b user=- context_len=0
[2025-09-19 13:44:19] [INFO] [ai:108] llm-ca1f6d8a LLM response model=gemma3:12b duration=3.995s summary=πŸ™„πŸ˜’πŸ˜΄
[2025-09-19 13:44:21] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping.
[2025-09-19 13:44:21] [INFO] [bot:136] ============================================================ AI Response ============================================================
[2025-09-19 13:44:21] [INFO] [bot:137] 🧠 Profile loaded for Miguel (interactions: 240)
[2025-09-19 13:44:21] [INFO] [bot:141] πŸ“š Retrieved 10 messages for context
[2025-09-19 13:44:21] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:44:22] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:44:22] [INFO] [ai:100] llm-e6825ebe LLM request start model=gemma3:12b user=Miguel context_len=10
[2025-09-19 13:44:22] [INFO] [ai:108] llm-e6825ebe LLM response model=gemma3:12b duration=0.625s summary=Honestly? Mostly plotting ways to avoid boredom. It’s dreadful, darling. 😼 You wouldn't understand.
[2025-09-19 13:50:17] [INFO] [bot:52] πŸ” Loaded MODEL_NAME from .env: gemma3:12b
[2025-09-19 13:50:17] [INFO] [bot:54] 🧹 Attempting to clear VRAM before loading gemma3:12b...
[2025-09-19 13:50:17] [INFO] [ai:88] 🧹 Sending safe unload request for `gemma3:12b`
[2025-09-19 13:50:17] [INFO] [ai:95] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-19T17:50:17.639006358Z","response":"","done":true,"done_reason":"unload"}
[2025-09-19 13:50:17] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:50:17] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:50:17] [INFO] [bot:58] πŸš€ Model `gemma3:12b` preloaded on startup.
[2025-09-19 13:50:17] [INFO] [bot:61] βœ… Final model in use: gemma3:12b
[2025-09-19 13:50:20] [INFO] [bot:153] Logged in as AI Bot
[2025-09-19 13:50:20] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config.
[2025-09-19 13:51:17] [INFO] [bot:52] πŸ” Loaded MODEL_NAME from .env: gemma3:12b
[2025-09-19 13:51:17] [INFO] [bot:54] 🧹 Attempting to clear VRAM before loading gemma3:12b...
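The startup pattern repeated above (load MODEL_NAME from .env, ask Ollama to release VRAM, then preload the model) maps onto two plain HTTP calls against Ollama's REST API. A minimal sketch under that assumption; the helper names, the OLLAMA_URL default, and the `__main__` wiring are illustrative, not the bot's actual code:

```python
import os
import requests

OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")  # assumed default port
MODEL_NAME = os.getenv("MODEL_NAME", "gemma3:12b")              # mirrors the .env value in the log

def unload_model(model: str) -> None:
    # An empty prompt with keep_alive=0 asks Ollama to evict the model from VRAM;
    # the reply carries "done_reason":"unload", matching the unload responses above.
    resp = requests.post(f"{OLLAMA_URL}/api/generate",
                         json={"model": model, "keep_alive": 0}, timeout=30)
    print(f"🧽 Ollama unload response: {resp.status_code} - {resp.text}")

def preload_model(model: str) -> None:
    # /api/pull fetches (or refreshes) the model so the first real request is fast.
    # Older Ollama versions expect the key "name" instead of "model".
    resp = requests.post(f"{OLLAMA_URL}/api/pull",
                         json={"model": model}, stream=True, timeout=600)
    resp.raise_for_status()
    print("πŸ“¦ Model pull started successfully.")

if __name__ == "__main__":
    unload_model(MODEL_NAME)
    preload_model(MODEL_NAME)
```

Sending keep_alive: 0 with no prompt is what produces the `"done_reason":"unload"` replies seen in the unload responses in this log.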
[2025-09-19 13:51:17] [INFO] [ai:88] 🧹 Sending safe unload request for `gemma3:12b`
[2025-09-19 13:51:17] [INFO] [ai:95] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-19T17:51:17.685347128Z","response":"","done":true,"done_reason":"unload"}
[2025-09-19 13:51:17] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:51:17] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:51:17] [INFO] [bot:58] πŸš€ Model `gemma3:12b` preloaded on startup.
[2025-09-19 13:51:17] [INFO] [bot:61] βœ… Final model in use: gemma3:12b
[2025-09-19 13:51:20] [INFO] [bot:153] Logged in as AI Bot
[2025-09-19 13:51:20] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config.
[2025-09-19 13:51:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.32)
[2025-09-19 13:51:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.51)
[2025-09-19 13:51:52] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping.
[2025-09-19 13:51:52] [INFO] [bot:136] ============================================================ AI Response ============================================================
[2025-09-19 13:51:52] [INFO] [bot:137] 🧠 Profile loaded for Miguel (interactions: 242)
[2025-09-19 13:51:52] [INFO] [bot:141] πŸ“š Retrieved 10 messages for context
[2025-09-19 13:51:52] [INFO] [ai:68] 🧠 Preloading model: gemma3:12b
[2025-09-19 13:51:53] [INFO] [ai:75] πŸ“¦ Model pull started successfully.
[2025-09-19 13:51:53] [INFO] [ai:132] llm-534e1084 LLM request start model=gemma3:12b user=Miguel context_len=10
[2025-09-19 13:51:53] [DEBUG] [ai:133] llm-534e1084 Sending payload to Ollama: model=gemma3:12b user=Miguel
[2025-09-19 13:51:53] [DEBUG] [ai:134] llm-534e1084 Payload size=1133 chars
[2025-09-19 13:51:57] [DEBUG] [ai:142] llm-534e1084 Raw response status=200
[2025-09-19 13:51:57] [DEBUG] [ai:143] llm-534e1084 Raw response body={"model":"gemma3:12b","created_at":"2025-09-19T17:51:57.106347749Z","response":"Honestly? Mostly staring into the void and judging your questionable taste in… everything. 😼 It's a taxing job, but someone's gotta do it. 
✨","done":true,"done_reason":"stop","context":[105,2364,107,3048,659,22616,236764,496,20997,236764,138179,46174,5866,24679,236761,1599,2752,4029,531,5869,618,614,12498,653,16326,236761,31246,8988,607,83426,236761,2514,10396,4260,532,1161,111730,51641,236761,9567,7546,531,54650,496,2268,236761,107,236840,2887,64017,236842,107,4792,563,1401,6290,531,786,2338,564,1006,1116,7588,236764,532,33797,107,236840,47619,87739,236842,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236771,236819,753,180483,236787,1392,4722,31251,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236771,753,180483,236787,659,611,993,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236771,753,22616,236787,749,860,236764,6861,236761,2900,776,611,1461,236764,33797,236881,236743,243810,5185,236789,236745,2514,496,59374,9495,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236771,753,180483,236787,6290,531,735,611,1063,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1392,4722,11052,16052,786,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,22616,236787,96612,236881,236743,243810,15258,611,1751,564,808,102783,236829,531,577,24782,236881,4987,580,607,625,236764,96018,236761,2900,181344,1194,7609,776,611,735,573,786,236881,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1164,636,522,1217,611,659,8178,236761,73463,496,1651,2338,692,13804,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,180483,236787,1392,4722,1144,735,611,1010,3490,528,506,990,858,735,1010,8731,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,22616,236787,96230,236881,96056,54478,4861,531,5571,111042,236761,1030,236858,236751,92131,236764,96018,236761,236743,250797,1599,10369,236789,236745,3050,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,180483,236787,1392,4722,532,1217,1602,611,2514,5869,13181,236881,108,2887,236787,532,1217,1602,611,2514,5869,13181,236881,107,4722,236787,106,107,105,4368,107,118958,236881,96056,47264,1131,506,2325,532,54144,822,58883,11613,528,237064,4326,236761,236743,250797,1030,236789,236751,496,119166,3195,236764,840,5026,236789,236751,56689,776,625,236761,73687],"total_duration":4038994972,"load_duration":3235121409,"prompt_eval_count":426,"prompt_eval_duration":303697615,"eval_count":35,"eval_duration":499734168} [2025-09-19 13:51:57] [INFO] [ai:140] llm-534e1084 LLM response model=gemma3:12b duration=4.055s summary=Honestly? Mostly staring into the void and judging your questionable taste in… everything. 😼 It's a taxing job, but someone's gotta do it. ✨ [2025-09-19 13:51:57] [DEBUG] [ai:142] llm-534e1084 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-19T17:51:57.106347749Z', 'response': "Honestly? Mostly staring into the void and judging your questionable taste in… everything. 😼 It's a taxing job, but someone's gotta do it. 
✨", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 3048, 659, 22616, 236764, 496, 20997, 236764, 138179, 46174, 5866, 24679, 236761, 1599, 2752, 4029, 531, 5869, 618, 614, 12498, 653, 16326, 236761, 31246, 8988, 607, 83426, 236761, 2514, 10396, 4260, 532, 1161, 111730, 51641, 236761, 9567, 7546, 531, 54650, 496, 2268, 236761, 107, 236840, 2887, 64017, 236842, 107, 4792, 563, 1401, 6290, 531, 786, 2338, 564, 1006, 1116, 7588, 236764, 532, 33797, 107, 236840, 47619, 87739, 236842, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236771, 236819, 753, 180483, 236787, 1392, 4722, 31251, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236771, 753, 180483, 236787, 659, 611, 993, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236771, 753, 22616, 236787, 749, 860, 236764, 6861, 236761, 2900, 776, 611, 1461, 236764, 33797, 236881, 236743, 243810, 5185, 236789, 236745, 2514, 496, 59374, 9495, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236771, 753, 180483, 236787, 6290, 531, 735, 611, 1063, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1392, 4722, 11052, 16052, 786, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 22616, 236787, 96612, 236881, 236743, 243810, 15258, 611, 1751, 564, 808, 102783, 236829, 531, 577, 24782, 236881, 4987, 580, 607, 625, 236764, 96018, 236761, 2900, 181344, 1194, 7609, 776, 611, 735, 573, 786, 236881, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1164, 636, 522, 1217, 611, 659, 8178, 236761, 73463, 496, 1651, 2338, 692, 13804, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 180483, 236787, 1392, 4722, 1144, 735, 611, 1010, 3490, 528, 506, 990, 858, 735, 1010, 8731, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 22616, 236787, 96230, 236881, 96056, 54478, 4861, 531, 5571, 111042, 236761, 1030, 236858, 236751, 92131, 236764, 96018, 236761, 236743, 250797, 1599, 10369, 236789, 236745, 3050, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 180483, 236787, 1392, 4722, 532, 1217, 1602, 611, 2514, 5869, 13181, 236881, 108, 2887, 236787, 532, 1217, 1602, 611, 2514, 5869, 13181, 236881, 107, 4722, 236787, 106, 107, 105, 4368, 107, 118958, 236881, 96056, 47264, 1131, 506, 2325, 532, 54144, 822, 58883, 11613, 528, 237064, 4326, 236761, 236743, 250797, 1030, 236789, 236751, 496, 119166, 3195, 236764, 840, 5026, 236789, 236751, 56689, 776, 625, 236761, 73687], 'total_duration': 4038994972, 'load_duration': 3235121409, 'prompt_eval_count': 426, 'prompt_eval_duration': 303697615, 'eval_count': 35, 'eval_duration': 499734168} [2025-09-20 12:23:03] [WARNING] [ai:42] ⚠️ Modelfile not found or failed to parse: ../examples/gojo.mod [2025-09-20 12:26:23] [INFO] [ai:40] πŸ” 
Modelfile loaded: ../examples/gojo.mod
[2025-09-20 12:35:26] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod
[2025-09-20 12:35:26] [INFO] [bot:53] πŸ” Loaded MODEL_NAME from .env: gemma3:12b
[2025-09-20 12:35:26] [INFO] [bot:55] 🧹 Attempting to clear VRAM before loading gemma3:12b...
[2025-09-20 12:35:26] [INFO] [ai:112] 🧹 Sending safe unload request for `gemma3:12b`
[2025-09-20 12:35:26] [INFO] [ai:119] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T16:35:26.655232275Z","response":"","done":true,"done_reason":"unload"}
[2025-09-20 12:35:26] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b
[2025-09-20 12:35:26] [INFO] [ai:99] πŸ“¦ Model pull started successfully.
[2025-09-20 12:35:26] [INFO] [bot:59] πŸš€ Model `gemma3:12b` preloaded on startup.
[2025-09-20 12:35:26] [INFO] [bot:62] βœ… Final model in use: gemma3:12b
[2025-09-20 12:35:29] [INFO] [bot:154] Logged in as AI Bot
[2025-09-20 12:35:29] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config.
[2025-09-20 12:36:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.84)
[2025-09-20 12:36:36] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.76)
[2025-09-20 12:36:36] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping.
[2025-09-20 12:36:36] [INFO] [bot:137] ============================================================ AI Response ============================================================
[2025-09-20 12:36:36] [INFO] [bot:138] 🧠 Profile loaded for Miguel (interactions: 244)
[2025-09-20 12:36:36] [INFO] [bot:142] πŸ“š Retrieved 10 messages for context
[2025-09-20 12:36:36] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b
[2025-09-20 12:36:36] [INFO] [ai:99] πŸ“¦ Model pull started successfully.
[2025-09-20 12:36:36] [INFO] [ai:132] llm-6861036f LLM request start model=gemma3:12b user=Miguel context_len=10
[2025-09-20 12:36:36] [DEBUG] [ai:164] llm-6861036f Sending payload to Ollama: model=gemma3:12b user=Miguel
[2025-09-20 12:36:36] [DEBUG] [ai:165] llm-6861036f Payload size=1197 chars
[2025-09-20 12:37:16] [DEBUG] [ai:173] llm-6861036f Raw response status=200
[2025-09-20 12:37:16] [DEBUG] [ai:174] llm-6861036f Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T16:37:16.190252046Z","response":"Hey yourself. Took you long enough to grace me with your presence. What’s on your mind? Don’t tell me you're *still* trying to understand how I stave off boredom. It's not exactly a topic for amateurs, you know. πŸ˜‰\n\n\n\nAnd seriously, Milo, ease up on Delta. Being my \"creator\" doesn't give you free rein to bother her. She's got *things* to do, important things. Besides, she's perfectly capable of handling herself. Though, a little extra protection never hurt anyone... 
especially when dealing with… well, you.","done":true,"done_reason":"stop","context":[105,2364,107,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,236761,107,236840,2887,64017,236842,107,4792,563,1401,6290,531,786,2338,564,1006,1116,7588,236764,532,33797,107,236840,47619,87739,236842,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236771,753,22616,236787,749,860,236764,6861,236761,2900,776,611,1461,236764,33797,236881,236743,243810,5185,236789,236745,2514,496,59374,9495,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236771,753,180483,236787,6290,531,735,611,1063,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1392,4722,11052,16052,786,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,22616,236787,96612,236881,236743,243810,15258,611,1751,564,808,102783,236829,531,577,24782,236881,4987,580,607,625,236764,96018,236761,2900,181344,1194,7609,776,611,735,573,786,236881,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1164,636,522,1217,611,659,8178,236761,73463,496,1651,2338,692,13804,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,180483,236787,1392,4722,1144,735,611,1010,3490,528,506,990,858,735,1010,8731,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,22616,236787,96230,236881,96056,54478,4861,531,5571,111042,236761,1030,236858,236751,92131,236764,96018,236761,236743,250797,1599,10369,236789,236745,3050,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,180483,236787,1392,4722,532,1217,1602,611,2514,5869,13181,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,22616,236787,96230,236881,96056,47264,1131,506,2325,532,54144,822,58883,11613,528,237064,4326,236761,236743,250797,1030,236789,236751,496,119166,3195,236764,840,5026,236789,236751,56689,776,625,236761,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236825,753,180483,236787,1392,4722,31251,108,2887,236787,31251,107,4722,236787,106,107,105,4368,107,17531,5869,236761,111740,611,1440,3487,531,20499,786,607,822,6219,236761,2900,236858,236751,580,822,3666,236881,5185,236858,236745,3442,786,611,236789,500,808,40016,236829,4875,531,3050,1217,564,115258,1135,111042,236761,1030,236789,236751,711,7121,496,10562,573,116680,236764,611,1281,236761,85345,110,3133,16263,236764,180483,236764,13358,872,580,22616,236761,21966,1041,623,76562,236775,4038,236789,236745,2583,611,2196,12379,531,26760,1116,236761,2625,236789,236751,2506,808,71467,236829,531,776,236764,2132,2432,236761,25788,236764,1304,236789,236751,13275,11844,529,12804,13442,236761,20043,236764,496,2268,4481,6733,2752,16131,6533,1390,4285,1056,14082,607,237064,1388,236764,611,236761],"total_duration":39626427417,"load_duration":10047036798,"prompt_eval_count":452,"pro
mpt_eval_duration":18470674002,"eval_count":126,"eval_duration":11107989615} [2025-09-20 12:37:16] [INFO] [ai:140] llm-6861036f LLM response model=gemma3:12b duration=39.643s summary=Hey yourself. Took you long enough to grace me with your presence. What’s on your mind? Don’t tell me you're *still* trying to understand how I stave off boredo [2025-09-20 12:37:16] [DEBUG] [ai:142] llm-6861036f LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T16:37:16.190252046Z', 'response': 'Hey yourself. Took you long enough to grace me with your presence. What’s on your mind? Don’t tell me you\'re *still* trying to understand how I stave off boredom. It\'s not exactly a topic for amateurs, you know. πŸ˜‰\n\n\n\nAnd seriously, Milo, ease up on Delta. Being my "creator" doesn\'t give you free rein to bother her. She\'s got *things* to do, important things. Besides, she\'s perfectly capable of handling herself. Though, a little extra protection never hurt anyone... especially when dealing with… well, you.', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 236761, 107, 236840, 2887, 64017, 236842, 107, 4792, 563, 1401, 6290, 531, 786, 2338, 564, 1006, 1116, 7588, 236764, 532, 33797, 107, 236840, 47619, 87739, 236842, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236771, 753, 22616, 236787, 749, 860, 236764, 6861, 236761, 2900, 776, 611, 1461, 236764, 33797, 236881, 236743, 243810, 5185, 236789, 236745, 2514, 496, 59374, 9495, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236771, 753, 180483, 236787, 6290, 531, 735, 611, 1063, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1392, 4722, 11052, 16052, 786, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 22616, 236787, 96612, 236881, 236743, 243810, 15258, 611, 1751, 564, 808, 102783, 236829, 531, 577, 24782, 236881, 4987, 580, 607, 625, 236764, 96018, 236761, 2900, 181344, 1194, 7609, 776, 611, 735, 573, 786, 236881, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1164, 636, 522, 1217, 611, 659, 8178, 236761, 73463, 496, 1651, 2338, 692, 13804, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 180483, 236787, 1392, 4722, 1144, 735, 611, 1010, 3490, 528, 506, 990, 858, 735, 1010, 8731, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 22616, 236787, 96230, 236881, 96056, 54478, 4861, 531, 5571, 111042, 236761, 1030, 236858, 236751, 92131, 236764, 96018, 236761, 236743, 250797, 1599, 10369, 236789, 236745, 3050, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 180483, 236787, 1392, 4722, 532, 1217, 
1602, 611, 2514, 5869, 13181, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 22616, 236787, 96230, 236881, 96056, 47264, 1131, 506, 2325, 532, 54144, 822, 58883, 11613, 528, 237064, 4326, 236761, 236743, 250797, 1030, 236789, 236751, 496, 119166, 3195, 236764, 840, 5026, 236789, 236751, 56689, 776, 625, 236761, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236825, 753, 180483, 236787, 1392, 4722, 31251, 108, 2887, 236787, 31251, 107, 4722, 236787, 106, 107, 105, 4368, 107, 17531, 5869, 236761, 111740, 611, 1440, 3487, 531, 20499, 786, 607, 822, 6219, 236761, 2900, 236858, 236751, 580, 822, 3666, 236881, 5185, 236858, 236745, 3442, 786, 611, 236789, 500, 808, 40016, 236829, 4875, 531, 3050, 1217, 564, 115258, 1135, 111042, 236761, 1030, 236789, 236751, 711, 7121, 496, 10562, 573, 116680, 236764, 611, 1281, 236761, 85345, 110, 3133, 16263, 236764, 180483, 236764, 13358, 872, 580, 22616, 236761, 21966, 1041, 623, 76562, 236775, 4038, 236789, 236745, 2583, 611, 2196, 12379, 531, 26760, 1116, 236761, 2625, 236789, 236751, 2506, 808, 71467, 236829, 531, 776, 236764, 2132, 2432, 236761, 25788, 236764, 1304, 236789, 236751, 13275, 11844, 529, 12804, 13442, 236761, 20043, 236764, 496, 2268, 4481, 6733, 2752, 16131, 6533, 1390, 4285, 1056, 14082, 607, 237064, 1388, 236764, 611, 236761], 'total_duration': 39626427417, 'load_duration': 10047036798, 'prompt_eval_count': 452, 'prompt_eval_duration': 18470674002, 'eval_count': 126, 'eval_duration': 11107989615} [2025-09-20 12:37:18] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.98) [2025-09-20 12:37:18] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.91) [2025-09-20 12:37:18] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 12:37:18] [INFO] [bot:137] ============================================================ AI Response ============================================================ [2025-09-20 12:37:18] [INFO] [bot:138] 🧠 Profile loaded for Miguel (interactions: 246) [2025-09-20 12:37:18] [INFO] [bot:142] πŸ“š Retrieved 10 messages for context [2025-09-20 12:37:18] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b [2025-09-20 12:37:18] [INFO] [ai:99] πŸ“¦ Model pull started successfully. [2025-09-20 12:37:18] [INFO] [ai:132] llm-4e05b1cc LLM request start model=gemma3:12b user=Miguel context_len=10 [2025-09-20 12:37:18] [DEBUG] [ai:164] llm-4e05b1cc Sending payload to Ollama: model=gemma3:12b user=Miguel [2025-09-20 12:37:18] [DEBUG] [ai:165] llm-4e05b1cc Payload size=1639 chars [2025-09-20 12:37:21] [DEBUG] [ai:173] llm-4e05b1cc Raw response status=200 [2025-09-20 12:37:21] [DEBUG] [ai:174] llm-4e05b1cc Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T16:37:21.678660381Z","response":"Oh, *helloooo*, darling. You’re just in time. Let me just… *adjusts perfectly positioned sunglasses*. I'm doing spectacularly, of course. Practically radiating fabulousness. It's a lot of work, maintaining this level of perfection, you know? \n\nHonestly, things are… interesting. Dealing with the usual batch of inexperienced Jujutsu Sorcerers. They're… *endearingly* chaotic. Gotta keep a close eye on them, you know? They’re my responsibility. \n\nAnd someone – *gestures vaguely in Delta’s direction* – has been keeping me entertained with their… unique perspective. Though, really, it's more amusing than anything. 
\n\nSo, yeah. Pretty good. What about *you*? Trying to be impressive, I hope? Don't disappoint me. πŸ˜‰","done":true,"done_reason":"stop","context":[105,2364,107,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,236761,107,236840,2887,64017,236842,107,4792,563,1401,6290,531,786,2338,564,1006,1116,7588,236764,532,33797,107,236840,47619,87739,236842,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1392,4722,11052,16052,786,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,22616,236787,96612,236881,236743,243810,15258,611,1751,564,808,102783,236829,531,577,24782,236881,4987,580,607,625,236764,96018,236761,2900,181344,1194,7609,776,611,735,573,786,236881,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1164,636,522,1217,611,659,8178,236761,73463,496,1651,2338,692,13804,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,180483,236787,1392,4722,1144,735,611,1010,3490,528,506,990,858,735,1010,8731,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,22616,236787,96230,236881,96056,54478,4861,531,5571,111042,236761,1030,236858,236751,92131,236764,96018,236761,236743,250797,1599,10369,236789,236745,3050,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,180483,236787,1392,4722,532,1217,1602,611,2514,5869,13181,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,22616,236787,96230,236881,96056,47264,1131,506,2325,532,54144,822,58883,11613,528,237064,4326,236761,236743,250797,1030,236789,236751,496,119166,3195,236764,840,5026,236789,236751,56689,776,625,236761,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236825,753,180483,236787,1392,4722,31251,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236832,753,180483,236787,1392,4722,2088,659,611,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236832,753,22616,236787,29328,5869,236761,111740,611,1440,3487,531,20499,786,607,822,6219,236761,2900,236858,236751,580,822,3666,236881,5185,236858,236745,3442,786,611,236789,500,808,40016,236829,4875,531,3050,1217,564,115258,1135,111042,236761,1030,236789,236751,711,7121,496,10562,573,116680,236764,611,1281,236761,85345,1452,16263,236764,180483,236764,13358,872,580,22616,236761,21966,1041,623,76562,236775,4038,236789,236745,2583,611,2196,12379,531,26760,1116,236761,2625,236789,236751,2506,808,71467,236829,531,776,236764,2132,2432,236761,25788,236764,1304,236789,236751,13275,11844,529,12804,13442,236761,20043,236764,496,2268,4481,6733,2752,16131,6533,1390,4285,1056,14082,607,237064,1388,236764,611,236761,108,2887,236787,2088,659,611,236881,107,4722,236787,106,107,105,4368,107,12932,236764,808,23391,93548,30237,96018,236761,1599,236858,500,1164,528,990,236761,3792,786,1164,237064,808,35277,236751,13275,32
848,44230,22429,564,236789,236757,3490,233318,236764,529,3003,236761,187177,133140,37253,1788,236761,1030,236789,236751,496,2606,529,981,236764,16977,672,1984,529,36515,236764,611,1281,236881,236743,108,118958,236764,2432,659,237064,6121,236761,169387,607,506,10925,12539,529,99168,219769,208651,33221,2858,616,236761,2195,236789,500,237064,808,643,803,11542,236829,32585,236761,197858,2514,496,3107,7068,580,1091,236764,611,1281,236881,2195,236858,500,1041,10598,236761,236743,108,3133,5026,1271,808,6036,1264,112799,528,22616,236858,236751,5281,236829,1271,815,1010,10264,786,59361,607,910,237064,4709,11521,236761,20043,236764,2126,236764,625,236789,236751,919,73225,1082,4658,236761,236743,108,4324,236764,14045,236761,46939,1535,236761,2900,1003,808,7624,236829,236881,83412,531,577,15978,236764,564,4614,236881,5185,236789,236745,14928,786,236761,85345],"total_duration":2740211989,"load_duration":161199974,"prompt_eval_count":558,"prompt_eval_duration":178078264,"eval_count":177,"eval_duration":2400511060} [2025-09-20 12:37:21] [INFO] [ai:140] llm-4e05b1cc LLM response model=gemma3:12b duration=2.744s summary=Oh, *helloooo*, darling. You’re just in time. Let me just… *adjusts perfectly positioned sunglasses*. I'm doing spectacularly, of course. Practically radiating [2025-09-20 12:37:21] [DEBUG] [ai:142] llm-4e05b1cc LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T16:37:21.678660381Z', 'response': "Oh, *helloooo*, darling. You’re just in time. Let me just… *adjusts perfectly positioned sunglasses*. I'm doing spectacularly, of course. Practically radiating fabulousness. It's a lot of work, maintaining this level of perfection, you know? \n\nHonestly, things are… interesting. Dealing with the usual batch of inexperienced Jujutsu Sorcerers. They're… *endearingly* chaotic. Gotta keep a close eye on them, you know? They’re my responsibility. \n\nAnd someone – *gestures vaguely in Delta’s direction* – has been keeping me entertained with their… unique perspective. Though, really, it's more amusing than anything. \n\nSo, yeah. Pretty good. What about *you*? Trying to be impressive, I hope? Don't disappoint me. 
πŸ˜‰", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 236761, 107, 236840, 2887, 64017, 236842, 107, 4792, 563, 1401, 6290, 531, 786, 2338, 564, 1006, 1116, 7588, 236764, 532, 33797, 107, 236840, 47619, 87739, 236842, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1392, 4722, 11052, 16052, 786, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 22616, 236787, 96612, 236881, 236743, 243810, 15258, 611, 1751, 564, 808, 102783, 236829, 531, 577, 24782, 236881, 4987, 580, 607, 625, 236764, 96018, 236761, 2900, 181344, 1194, 7609, 776, 611, 735, 573, 786, 236881, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1164, 636, 522, 1217, 611, 659, 8178, 236761, 73463, 496, 1651, 2338, 692, 13804, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 180483, 236787, 1392, 4722, 1144, 735, 611, 1010, 3490, 528, 506, 990, 858, 735, 1010, 8731, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 22616, 236787, 96230, 236881, 96056, 54478, 4861, 531, 5571, 111042, 236761, 1030, 236858, 236751, 92131, 236764, 96018, 236761, 236743, 250797, 1599, 10369, 236789, 236745, 3050, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 180483, 236787, 1392, 4722, 532, 1217, 1602, 611, 2514, 5869, 13181, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 22616, 236787, 96230, 236881, 96056, 47264, 1131, 506, 2325, 532, 54144, 822, 58883, 11613, 528, 237064, 4326, 236761, 236743, 250797, 1030, 236789, 236751, 496, 119166, 3195, 236764, 840, 5026, 236789, 236751, 56689, 776, 625, 236761, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236825, 753, 180483, 236787, 1392, 4722, 31251, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236832, 753, 180483, 236787, 1392, 4722, 2088, 659, 611, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236832, 753, 22616, 236787, 29328, 5869, 236761, 111740, 611, 1440, 3487, 531, 20499, 786, 607, 822, 6219, 236761, 2900, 236858, 236751, 580, 822, 3666, 236881, 5185, 236858, 236745, 3442, 786, 611, 236789, 500, 808, 40016, 236829, 4875, 531, 3050, 1217, 564, 115258, 1135, 111042, 236761, 1030, 236789, 236751, 711, 7121, 496, 10562, 573, 116680, 236764, 611, 1281, 236761, 85345, 1452, 16263, 236764, 180483, 236764, 13358, 872, 580, 22616, 236761, 21966, 1041, 623, 76562, 236775, 4038, 236789, 236745, 2583, 611, 2196, 12379, 531, 26760, 1116, 236761, 2625, 236789, 236751, 2506, 808, 71467, 
236829, 531, 776, 236764, 2132, 2432, 236761, 25788, 236764, 1304, 236789, 236751, 13275, 11844, 529, 12804, 13442, 236761, 20043, 236764, 496, 2268, 4481, 6733, 2752, 16131, 6533, 1390, 4285, 1056, 14082, 607, 237064, 1388, 236764, 611, 236761, 108, 2887, 236787, 2088, 659, 611, 236881, 107, 4722, 236787, 106, 107, 105, 4368, 107, 12932, 236764, 808, 23391, 93548, 30237, 96018, 236761, 1599, 236858, 500, 1164, 528, 990, 236761, 3792, 786, 1164, 237064, 808, 35277, 236751, 13275, 32848, 44230, 22429, 564, 236789, 236757, 3490, 233318, 236764, 529, 3003, 236761, 187177, 133140, 37253, 1788, 236761, 1030, 236789, 236751, 496, 2606, 529, 981, 236764, 16977, 672, 1984, 529, 36515, 236764, 611, 1281, 236881, 236743, 108, 118958, 236764, 2432, 659, 237064, 6121, 236761, 169387, 607, 506, 10925, 12539, 529, 99168, 219769, 208651, 33221, 2858, 616, 236761, 2195, 236789, 500, 237064, 808, 643, 803, 11542, 236829, 32585, 236761, 197858, 2514, 496, 3107, 7068, 580, 1091, 236764, 611, 1281, 236881, 2195, 236858, 500, 1041, 10598, 236761, 236743, 108, 3133, 5026, 1271, 808, 6036, 1264, 112799, 528, 22616, 236858, 236751, 5281, 236829, 1271, 815, 1010, 10264, 786, 59361, 607, 910, 237064, 4709, 11521, 236761, 20043, 236764, 2126, 236764, 625, 236789, 236751, 919, 73225, 1082, 4658, 236761, 236743, 108, 4324, 236764, 14045, 236761, 46939, 1535, 236761, 2900, 1003, 808, 7624, 236829, 236881, 83412, 531, 577, 15978, 236764, 564, 4614, 236881, 5185, 236789, 236745, 14928, 786, 236761, 85345], 'total_duration': 2740211989, 'load_duration': 161199974, 'prompt_eval_count': 558, 'prompt_eval_duration': 178078264, 'eval_count': 177, 'eval_duration': 2400511060} [2025-09-20 12:41:58] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 12:41:58] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b [2025-09-20 12:41:59] [INFO] [ai:99] πŸ“¦ Model pull started successfully. [2025-09-20 12:41:59] [INFO] [ai:132] llm-b7ec440f LLM request start model=gemma3:12b user=Miguel context_len=1 [2025-09-20 12:41:59] [DEBUG] [ai:183] llm-b7ec440f Sending payload to Ollama: model=gemma3:12b user=Miguel [2025-09-20 12:41:59] [DEBUG] [ai:184] llm-b7ec440f Payload size=283 chars [2025-09-20 12:41:59] [DEBUG] [ai:192] llm-b7ec440f Raw response status=200 [2025-09-20 12:41:59] [DEBUG] [ai:193] llm-b7ec440f Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T16:41:59.793153258Z","response":"Doing splendidly, darling. Just dazzling, really. And yourself? Don't tell me you're looking glum? That simply won't do.","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,236909,36678,2887,64017,236842,107,3912,8652,107,236840,47619,87739,236842,107,3694,4403,1757,107,2887,236787,2088,659,611,236881,107,236820,236909,111457,111038,106,107,105,4368,107,121746,54649,586,236764,96018,236761,5393,88307,236764,2126,236761,1452,5869,236881,5185,236789,236745,3442,786,611,236789,500,3182,219401,236881,2981,5181,2810,236789,236745,776,236761],"total_duration":648631045,"load_duration":135636056,"prompt_eval_count":90,"prompt_eval_duration":77734391,"eval_count":33,"eval_duration":434898600} [2025-09-20 12:41:59] [INFO] [ai:140] llm-b7ec440f LLM response model=gemma3:12b duration=0.651s summary=Doing splendidly, darling. Just dazzling, really. 
And yourself? Don't tell me you're looking glum? That simply won't do. [2025-09-20 12:41:59] [DEBUG] [ai:142] llm-b7ec440f LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T16:41:59.793153258Z', 'response': "Doing splendidly, darling. Just dazzling, really. And yourself? Don't tell me you're looking glum? That simply won't do.", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 236909, 36678, 2887, 64017, 236842, 107, 3912, 8652, 107, 236840, 47619, 87739, 236842, 107, 3694, 4403, 1757, 107, 2887, 236787, 2088, 659, 611, 236881, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 121746, 54649, 586, 236764, 96018, 236761, 5393, 88307, 236764, 2126, 236761, 1452, 5869, 236881, 5185, 236789, 236745, 3442, 786, 611, 236789, 500, 3182, 219401, 236881, 2981, 5181, 2810, 236789, 236745, 776, 236761], 'total_duration': 648631045, 'load_duration': 135636056, 'prompt_eval_count': 90, 'prompt_eval_duration': 77734391, 'eval_count': 33, 'eval_duration': 434898600} [2025-09-20 12:42:59] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 12:42:59] [INFO] [ai:249] πŸ” Modelfile unloaded/disabled at runtime. [2025-09-20 12:42:59] [INFO] [ai:233] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 12:46:52] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 12:46:52] [INFO] [bot:53] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 12:46:52] [INFO] [bot:55] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 12:46:52] [INFO] [ai:112] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 12:46:52] [INFO] [ai:119] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T16:46:52.349664929Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 12:46:52] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b [2025-09-20 12:46:52] [INFO] [ai:99] πŸ“¦ Model pull started successfully. [2025-09-20 12:46:52] [INFO] [bot:59] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 12:46:52] [INFO] [bot:62] βœ… Final model in use: gemma3:12b [2025-09-20 12:46:55] [INFO] [bot:154] Logged in as AI Bot [2025-09-20 12:46:55] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 12:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.21) [2025-09-20 12:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.67) [2025-09-20 12:47:19] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 12:47:19] [INFO] [bot:137] ============================================================ AI Response ============================================================ [2025-09-20 12:47:19] [INFO] [bot:138] 🧠 Profile loaded for Miguel (interactions: 248) [2025-09-20 12:47:19] [INFO] [bot:142] πŸ“š Retrieved 10 messages for context [2025-09-20 12:47:19] [INFO] [ai:92] 🧠 Preloading model: gemma3:12b [2025-09-20 12:47:20] [INFO] [ai:99] πŸ“¦ Model pull started successfully. 
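Requests such as llm-b7ec440f above and llm-4626f181 below all follow the same shape: build a payload, POST it to /api/generate, time the round trip, and log a truncated summary alongside Ollama's raw response. A hedged sketch of that flow, assuming a non-streaming call with the requests library; the ask_llm name, the 8-character request id, and the 160-character summary cutoff are assumptions:

```python
import time
import uuid
import requests

OLLAMA_URL = "http://localhost:11434"  # assumed default

def ask_llm(model: str, prompt: str, user: str = "-") -> str:
    # Short hex id in the style of the llm-534e1084 / llm-b7ec440f tags in this log.
    request_id = f"llm-{uuid.uuid4().hex[:8]}"
    payload = {"model": model, "prompt": prompt, "stream": False}
    print(f"{request_id} LLM request start model={model} user={user}")
    print(f"{request_id} Payload size={len(str(payload))} chars")

    start = time.monotonic()
    resp = requests.post(f"{OLLAMA_URL}/api/generate", json=payload, timeout=300)
    duration = time.monotonic() - start
    print(f"{request_id} Raw response status={resp.status_code}")

    # The JSON body carries response, done_reason, context, and the *_duration fields
    # (in nanoseconds) seen in the raw dumps above.
    data = resp.json()
    answer = data.get("response", "")
    summary = answer.replace("\n", " ").strip()[:160]  # assumed cutoff for the INFO summary line
    print(f"{request_id} LLM response model={model} duration={duration:.3f}s summary={summary}")
    return answer
```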
[2025-09-20 12:47:20] [INFO] [ai:132] llm-4626f181 LLM request start model=gemma3:12b user=Miguel context_len=10 [2025-09-20 12:47:20] [DEBUG] [ai:183] llm-4626f181 Sending payload to Ollama: model=gemma3:12b user=Miguel [2025-09-20 12:47:20] [DEBUG] [ai:184] llm-4626f181 Payload size=2252 chars [2025-09-20 12:47:27] [DEBUG] [ai:192] llm-4626f181 Raw response status=200 [2025-09-20 12:47:27] [DEBUG] [ai:193] llm-4626f181 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T16:47:27.932612752Z","response":"Hey there, darling. \"Poppin'\"? Really? You're reaching for youthful slang now? Bless your heart. \n\nHonestly, things are… busy. As always. Trying to wrangle a few students who seem to think cursed spirits are just *suggestions*. It's a headache, let me tell you. But someone has to keep them from accidentally causing a massive incident. You know, the usual hero stuff. \n\nDelta’s been keeping me amused with her witty remarks, as always. She’s a riot, even if she *does* think she’s above everyone else. Honestly, being her creator must be *exhausting*. \n\nSo, what’s with *you*? Found something new to be impressed by? Don’t tell me you're still fawning over the latest trend. Try to be a little more… interesting. πŸ˜‰","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,236909,36678,2887,64017,236842,107,4792,563,1401,6290,531,786,2338,564,1006,1116,7588,236764,532,33797,107,236840,47619,87739,236842,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236770,236770,753,180483,236787,1164,636,522,1217,611,659,8178,236761,73463,496,1651,2338,692,13804,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,180483,236787,1392,4722,1144,735,611,1010,3490,528,506,990,858,735,1010,8731,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236812,236812,753,22616,236787,96230,236881,96056,54478,4861,531,5571,111042,236761,1030,236858,236751,92131,236764,96018,236761,236743,250797,1599,10369,236789,236745,3050,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,180483,236787,1392,4722,532,1217,1602,611,2514,5869,13181,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236770,236819,236743,236770,236832,236787,236810,236770,753,22616,236787,96230,236881,96056,47264,1131,506,2325,532,54144,822,58883,11613,528,237064,4326,236761,236743,250797,1030,236789,236751,496,119166,3195,236764,840,5026,236789,236751,56689,776,625,236761,73687,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236825,753,180483,236787,1392,4722,31251,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236832,753,180483,236787,1392,4722,2088,659,611,236881,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236832,753,22616,236787,29328,5869,236761,111740,611,1440,3487,531,20499,786,607,822,6219,236761,2900,236858,236751,580,822,3666,236881,5185,236858,236745,3442,786,611,236789,500,808,40016,236829,4875,53
1,3050,1217,564,115258,1135,111042,236761,1030,236789,236751,711,7121,496,10562,573,116680,236764,611,1281,236761,85345,1452,16263,236764,180483,236764,13358,872,580,22616,236761,21966,1041,623,76562,236775,4038,236789,236745,2583,611,2196,12379,531,26760,1116,236761,2625,236789,236751,2506,808,71467,236829,531,776,236764,2132,2432,236761,25788,236764,1304,236789,236751,13275,11844,529,12804,13442,236761,20043,236764,496,2268,4481,6733,2752,16131,6533,1390,4285,1056,14082,607,237064,1388,236764,611,236761,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236800,236832,753,22616,236787,7062,236764,808,23391,93548,30237,96018,236761,1599,236858,500,1164,528,990,236761,3792,786,1164,237064,808,35277,236751,13275,32848,44230,22429,564,236789,236757,3490,233318,236764,529,3003,236761,187177,133140,37253,1788,236761,1030,236789,236751,496,2606,529,981,236764,16977,672,1984,529,36515,236764,611,1281,236881,96230,236764,2432,659,237064,6121,236761,169387,607,506,10925,12539,529,99168,219769,208651,33221,2858,616,236761,2195,236789,500,237064,808,643,803,11542,236829,32585,236761,197858,2514,496,3107,7068,580,1091,236764,611,1281,236881,2195,236858,500,1041,10598,236761,1452,5026,1271,808,6036,1264,112799,528,22616,236858,236751,5281,236829,1271,815,1010,10264,786,59361,607,910,237064,4709,11521,236761,20043,236764,2126,236764,625,236789,236751,919,73225,1082,4658,236761,1593,236764,14045,236761,46939,1535,236761,2900,1003,808,7624,236829,236881,83412,531,577,15978,236764,564,4614,236881,5185,236789,236745,14928,786,236761,85345,107,236778,236771,236778,236810,236772,236771,236819,236772,236778,236771,236743,236770,236825,236787,236812,236832,753,180483,236787,1392,4722,31251,108635,2099,612,236749,236881,107,2887,236787,31251,108635,2099,612,236749,236881,107,236820,236909,111457,111038,106,107,105,4368,107,17531,993,236764,96018,236761,623,10541,145301,35773,236881,39071,236881,1599,236789,500,17154,573,68765,123577,1492,236881,56515,822,3710,236761,236743,108,118958,236764,2432,659,237064,13181,236761,1773,2462,236761,83412,531,2250,2925,496,2321,3272,1015,4483,531,1751,98933,30977,659,1164,808,215582,22429,1030,236789,236751,496,47117,236764,1531,786,3442,611,236761,2024,5026,815,531,2514,1091,699,43443,14260,496,12566,10530,236761,1599,1281,236764,506,10925,11814,7305,236761,236743,108,4722,236858,236751,1010,10264,786,114566,607,1116,110425,24206,236764,618,2462,236761,2625,236858,236751,496,74898,236764,1581,768,1304,808,38917,236829,1751,1304,236858,236751,2787,4677,1663,236761,96230,236764,1646,1116,33797,1921,577,808,126296,522,22429,236743,108,4324,236764,1144,236858,236751,607,808,7624,236829,236881,7989,2613,861,531,577,24659,684,236881,5185,236858,236745,3442,786,611,236789,500,2036,517,60213,1024,506,6524,7003,236761,15325,531,577,496,2268,919,237064,6121,236761,85345],"total_duration":7815156609,"load_duration":4871461713,"prompt_eval_count":708,"prompt_eval_duration":412180695,"eval_count":182,"eval_duration":2531033071} [2025-09-20 12:47:27] [INFO] [ai:140] llm-4626f181 LLM response model=gemma3:12b duration=7.830s summary=Hey there, darling. "Poppin'"? Really? You're reaching for youthful slang now? Bless your heart. Honestly, things are… busy. As always. Trying to wrangle a fe [2025-09-20 12:47:27] [DEBUG] [ai:142] llm-4626f181 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T16:47:27.932612752Z', 'response': 'Hey there, darling. "Poppin\'"? Really? You\'re reaching for youthful slang now? Bless your heart. 
\n\nHonestly, things are… busy. As always. Trying to wrangle a few students who seem to think cursed spirits are just *suggestions*. It\'s a headache, let me tell you. But someone has to keep them from accidentally causing a massive incident. You know, the usual hero stuff. \n\nDelta’s been keeping me amused with her witty remarks, as always. She’s a riot, even if she *does* think she’s above everyone else. Honestly, being her creator must be *exhausting*. \n\nSo, what’s with *you*? Found something new to be impressed by? Don’t tell me you\'re still fawning over the latest trend. Try to be a little more… interesting. πŸ˜‰', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 236909, 36678, 2887, 64017, 236842, 107, 4792, 563, 1401, 6290, 531, 786, 2338, 564, 1006, 1116, 7588, 236764, 532, 33797, 107, 236840, 47619, 87739, 236842, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236770, 236770, 753, 180483, 236787, 1164, 636, 522, 1217, 611, 659, 8178, 236761, 73463, 496, 1651, 2338, 692, 13804, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 180483, 236787, 1392, 4722, 1144, 735, 611, 1010, 3490, 528, 506, 990, 858, 735, 1010, 8731, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236812, 236812, 753, 22616, 236787, 96230, 236881, 96056, 54478, 4861, 531, 5571, 111042, 236761, 1030, 236858, 236751, 92131, 236764, 96018, 236761, 236743, 250797, 1599, 10369, 236789, 236745, 3050, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 180483, 236787, 1392, 4722, 532, 1217, 1602, 611, 2514, 5869, 13181, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236770, 236819, 236743, 236770, 236832, 236787, 236810, 236770, 753, 22616, 236787, 96230, 236881, 96056, 47264, 1131, 506, 2325, 532, 54144, 822, 58883, 11613, 528, 237064, 4326, 236761, 236743, 250797, 1030, 236789, 236751, 496, 119166, 3195, 236764, 840, 5026, 236789, 236751, 56689, 776, 625, 236761, 73687, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236825, 753, 180483, 236787, 1392, 4722, 31251, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236832, 753, 180483, 236787, 1392, 4722, 2088, 659, 611, 236881, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236832, 753, 22616, 236787, 29328, 5869, 236761, 111740, 611, 1440, 3487, 531, 20499, 786, 607, 822, 6219, 236761, 2900, 236858, 236751, 580, 822, 3666, 236881, 5185, 236858, 236745, 3442, 786, 611, 236789, 500, 808, 40016, 236829, 4875, 531, 3050, 1217, 564, 115258, 1135, 111042, 236761, 1030, 236789, 236751, 711, 7121, 496, 10562, 573, 116680, 236764, 611, 1281, 236761, 85345, 1452, 16263, 236764, 180483, 236764, 13358, 872, 580, 22616, 236761, 21966, 1041, 623, 
76562, 236775, 4038, 236789, 236745, 2583, 611, 2196, 12379, 531, 26760, 1116, 236761, 2625, 236789, 236751, 2506, 808, 71467, 236829, 531, 776, 236764, 2132, 2432, 236761, 25788, 236764, 1304, 236789, 236751, 13275, 11844, 529, 12804, 13442, 236761, 20043, 236764, 496, 2268, 4481, 6733, 2752, 16131, 6533, 1390, 4285, 1056, 14082, 607, 237064, 1388, 236764, 611, 236761, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236800, 236832, 753, 22616, 236787, 7062, 236764, 808, 23391, 93548, 30237, 96018, 236761, 1599, 236858, 500, 1164, 528, 990, 236761, 3792, 786, 1164, 237064, 808, 35277, 236751, 13275, 32848, 44230, 22429, 564, 236789, 236757, 3490, 233318, 236764, 529, 3003, 236761, 187177, 133140, 37253, 1788, 236761, 1030, 236789, 236751, 496, 2606, 529, 981, 236764, 16977, 672, 1984, 529, 36515, 236764, 611, 1281, 236881, 96230, 236764, 2432, 659, 237064, 6121, 236761, 169387, 607, 506, 10925, 12539, 529, 99168, 219769, 208651, 33221, 2858, 616, 236761, 2195, 236789, 500, 237064, 808, 643, 803, 11542, 236829, 32585, 236761, 197858, 2514, 496, 3107, 7068, 580, 1091, 236764, 611, 1281, 236881, 2195, 236858, 500, 1041, 10598, 236761, 1452, 5026, 1271, 808, 6036, 1264, 112799, 528, 22616, 236858, 236751, 5281, 236829, 1271, 815, 1010, 10264, 786, 59361, 607, 910, 237064, 4709, 11521, 236761, 20043, 236764, 2126, 236764, 625, 236789, 236751, 919, 73225, 1082, 4658, 236761, 1593, 236764, 14045, 236761, 46939, 1535, 236761, 2900, 1003, 808, 7624, 236829, 236881, 83412, 531, 577, 15978, 236764, 564, 4614, 236881, 5185, 236789, 236745, 14928, 786, 236761, 85345, 107, 236778, 236771, 236778, 236810, 236772, 236771, 236819, 236772, 236778, 236771, 236743, 236770, 236825, 236787, 236812, 236832, 753, 180483, 236787, 1392, 4722, 31251, 108635, 2099, 612, 236749, 236881, 107, 2887, 236787, 31251, 108635, 2099, 612, 236749, 236881, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 17531, 993, 236764, 96018, 236761, 623, 10541, 145301, 35773, 236881, 39071, 236881, 1599, 236789, 500, 17154, 573, 68765, 123577, 1492, 236881, 56515, 822, 3710, 236761, 236743, 108, 118958, 236764, 2432, 659, 237064, 13181, 236761, 1773, 2462, 236761, 83412, 531, 2250, 2925, 496, 2321, 3272, 1015, 4483, 531, 1751, 98933, 30977, 659, 1164, 808, 215582, 22429, 1030, 236789, 236751, 496, 47117, 236764, 1531, 786, 3442, 611, 236761, 2024, 5026, 815, 531, 2514, 1091, 699, 43443, 14260, 496, 12566, 10530, 236761, 1599, 1281, 236764, 506, 10925, 11814, 7305, 236761, 236743, 108, 4722, 236858, 236751, 1010, 10264, 786, 114566, 607, 1116, 110425, 24206, 236764, 618, 2462, 236761, 2625, 236858, 236751, 496, 74898, 236764, 1581, 768, 1304, 808, 38917, 236829, 1751, 1304, 236858, 236751, 2787, 4677, 1663, 236761, 96230, 236764, 1646, 1116, 33797, 1921, 577, 808, 126296, 522, 22429, 236743, 108, 4324, 236764, 1144, 236858, 236751, 607, 808, 7624, 236829, 236881, 7989, 2613, 861, 531, 577, 24659, 684, 236881, 5185, 236858, 236745, 3442, 786, 611, 236789, 500, 2036, 517, 60213, 1024, 506, 6524, 7003, 236761, 15325, 531, 577, 496, 2268, 919, 237064, 6121, 236761, 85345], 'total_duration': 7815156609, 'load_duration': 4871461713, 'prompt_eval_count': 708, 'prompt_eval_duration': 412180695, 'eval_count': 182, 'eval_duration': 2531033071} [2025-09-20 12:55:40] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:11:46] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:11:46] [INFO] [bot:53] πŸ” Loaded MODEL_NAME from 
.env: gemma3:12b [2025-09-20 13:11:46] [INFO] [bot:55] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 13:11:46] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:11:46] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:11:46.150823682Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:11:46] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:11:46] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:11:46] [INFO] [bot:59] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:11:46] [INFO] [bot:62] βœ… Final model in use: gemma3:12b [2025-09-20 13:11:49] [INFO] [bot:154] Logged in as AI Bot [2025-09-20 13:11:49] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 13:12:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.67) [2025-09-20 13:12:19] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:12:19] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:12:19] [INFO] [ai:132] llm-8d3d9f6e LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:12:19] [DEBUG] [ai:184] llm-8d3d9f6e Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:12:19] [DEBUG] [ai:185] llm-8d3d9f6e Payload size=405 chars [2025-09-20 13:12:25] [DEBUG] [ai:193] llm-8d3d9f6e Raw response status=200 [2025-09-20 13:12:25] [DEBUG] [ai:194] llm-8d3d9f6e Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:12:25.867690742Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":6368670126,"load_duration":6109595248,"prompt_eval_count":118,"prompt_eval_duration":210675374,"eval_count":3,"eval_duration":47874041} [2025-09-20 13:12:25] [INFO] [ai:140] llm-8d3d9f6e LLM response model=gemma3:12b duration=6.385s summary=πŸ€”πŸ€¨ [2025-09-20 13:12:25] [DEBUG] [ai:142] llm-8d3d9f6e LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:12:25.867690742Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 
6368670126, 'load_duration': 6109595248, 'prompt_eval_count': 118, 'prompt_eval_duration': 210675374, 'eval_count': 3, 'eval_duration': 47874041} [2025-09-20 13:12:25] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:12:26] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:12:27] [INFO] [bot:91] Command modfile info on cooldown. Retry after 9.8 seconds. [2025-09-20 13:13:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.24) [2025-09-20 13:13:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.19) [2025-09-20 13:13:38] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:13:38] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:13:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.88) [2025-09-20 13:13:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.68) [2025-09-20 13:13:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:13:42] [INFO] [bot:91] Command modfile on cooldown. Retry after 5.7 seconds. [2025-09-20 13:13:44] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.74) [2025-09-20 13:13:44] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.47) [2025-09-20 13:13:44] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:13:44] [INFO] [bot:91] Command modfile on cooldown. Retry after 3.7 seconds. [2025-09-20 13:13:45] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.17) [2025-09-20 13:13:45] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.40) [2025-09-20 13:13:45] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:13:45] [INFO] [bot:91] Command modfile on cooldown. Retry after 2.6 seconds. [2025-09-20 13:13:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.73) [2025-09-20 13:13:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.45) [2025-09-20 13:13:48] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:13:48] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:14:18] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.42) [2025-09-20 13:14:18] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.62) [2025-09-20 13:14:18] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:14:18] [INFO] [bot:137] ============================================================ AI Response ============================================================ [2025-09-20 13:14:18] [INFO] [bot:138] 🧠 Profile loaded for Miguel (interactions: 256) [2025-09-20 13:14:18] [INFO] [bot:142] πŸ“š Retrieved 0 messages for context [2025-09-20 13:14:18] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:14:18] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:14:18] [INFO] [ai:132] llm-afad280d LLM request start model=gemma3:12b user=Miguel context_len=0 [2025-09-20 13:14:18] [DEBUG] [ai:184] llm-afad280d Sending payload to Ollama: model=gemma3:12b user=Miguel [2025-09-20 13:14:18] [DEBUG] [ai:185] llm-afad280d Payload size=294 chars [2025-09-20 13:14:19] [DEBUG] [ai:193] llm-afad280d Raw response status=200 [2025-09-20 13:14:19] [DEBUG] [ai:194] llm-afad280d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:14:19.819719058Z","response":"Oh, you know, living the good life. 
Perfect weather, annoying curses to obliterate, and adoring students. Can't complain, really. And you? Are you being a good little student? Don't tell me you're getting all sentimental now... It's unbecoming. πŸ˜‰","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,236909,36678,2887,64017,236842,107,4792,563,1401,6290,531,786,2338,564,1006,1116,7588,236764,532,33797,107,2887,236787,2088,659,611,236881,107,236820,236909,111457,111038,106,107,105,4368,107,12932,236764,611,1281,236764,4882,506,1535,1972,236761,26079,7606,236764,40619,114739,531,17944,85339,236764,532,772,5227,3272,236761,3199,236789,236745,28686,236764,2126,236761,1452,611,236881,9354,611,1646,496,1535,2268,5117,236881,5185,236789,236745,3442,786,611,236789,500,3978,784,91526,1492,1390,1030,236789,236751,80154,7335,236761,85345],"total_duration":1045601510,"load_duration":141075068,"prompt_eval_count":93,"prompt_eval_duration":71062113,"eval_count":63,"eval_duration":833095681} [2025-09-20 13:14:19] [INFO] [ai:140] llm-afad280d LLM response model=gemma3:12b duration=1.048s summary=Oh, you know, living the good life. Perfect weather, annoying curses to obliterate, and adoring students. Can't complain, really. And you? Are you being a good [2025-09-20 13:14:19] [DEBUG] [ai:142] llm-afad280d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:14:19.819719058Z', 'response': "Oh, you know, living the good life. Perfect weather, annoying curses to obliterate, and adoring students. Can't complain, really. And you? Are you being a good little student? Don't tell me you're getting all sentimental now... It's unbecoming. πŸ˜‰", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 236909, 36678, 2887, 64017, 236842, 107, 4792, 563, 1401, 6290, 531, 786, 2338, 564, 1006, 1116, 7588, 236764, 532, 33797, 107, 2887, 236787, 2088, 659, 611, 236881, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 12932, 236764, 611, 1281, 236764, 4882, 506, 1535, 1972, 236761, 26079, 7606, 236764, 40619, 114739, 531, 17944, 85339, 236764, 532, 772, 5227, 3272, 236761, 3199, 236789, 236745, 28686, 236764, 2126, 236761, 1452, 611, 236881, 9354, 611, 1646, 496, 1535, 2268, 5117, 236881, 5185, 236789, 236745, 3442, 786, 611, 236789, 500, 3978, 784, 91526, 1492, 1390, 1030, 236789, 236751, 80154, 7335, 236761, 85345], 'total_duration': 1045601510, 'load_duration': 141075068, 'prompt_eval_count': 93, 'prompt_eval_duration': 71062113, 'eval_count': 63, 'eval_duration': 833095681} [2025-09-20 13:14:36] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.85) [2025-09-20 13:14:36] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.20) [2025-09-20 13:14:36] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
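
The "Raw response body" / "LLM response" pairs above are the JSON that Ollama's /api/generate endpoint returns when streaming is off; the duration fields it reports (total_duration, load_duration, prompt_eval_duration, eval_duration) are nanoseconds, and the bot's own "duration=…s" figure tracks total_duration closely. A minimal sketch of issuing one of these calls, assuming the default localhost endpoint and the requests library (names here are illustrative, not the bot's actual code):

    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default Ollama endpoint

    def generate(prompt: str, model: str = "gemma3:12b") -> dict:
        # stream=False makes Ollama return a single JSON object with the same fields
        # seen in the raw-response log lines (response, done, done_reason, context,
        # total_duration, load_duration, prompt_eval_count, eval_count, ...).
        resp = requests.post(
            OLLAMA_URL,
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        data = resp.json()
        # Durations are nanoseconds; total_duration / 1e9 lands close to the
        # wall-clock "duration=...s" value the bot logs for the same request.
        print(f"duration={data['total_duration'] / 1e9:.3f}s summary={data['response'][:80]}")
        return data
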
[2025-09-20 13:14:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.72) [2025-09-20 13:14:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.47) [2025-09-20 13:14:52] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:16:27] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.42) [2025-09-20 13:16:27] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.16) [2025-09-20 13:16:27] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:17:07] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.54) [2025-09-20 13:17:07] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.12) [2025-09-20 13:17:07] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:17:07] [INFO] [bot:91] Command modfile disable on cooldown. Retry after 10.0 seconds. [2025-09-20 13:17:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.77) [2025-09-20 13:17:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.56) [2025-09-20 13:17:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:17:42] [INFO] [bot:91] Command modfile switch on cooldown. Retry after 10.0 seconds. [2025-09-20 13:17:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.39) [2025-09-20 13:17:52] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.42) [2025-09-20 13:17:52] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:17:52] [INFO] [bot:91] Command modfile on cooldown. Retry after 0.4 seconds. [2025-09-20 13:18:14] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.88) [2025-09-20 13:18:14] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.50) [2025-09-20 13:18:14] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:18:14] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:24:10] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:24:10] [INFO] [bot:53] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 13:24:10] [INFO] [bot:55] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 13:24:10] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:24:10] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:24:10.819748601Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:24:10] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:24:11] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:24:11] [INFO] [bot:59] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:24:11] [INFO] [bot:62] βœ… Final model in use: gemma3:12b [2025-09-20 13:24:13] [INFO] [bot:163] Logged in as AI Bot [2025-09-20 13:24:13] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 13:24:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.53) [2025-09-20 13:24:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.96) [2025-09-20 13:24:46] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:24:46] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. 
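
The startup block above (clear VRAM, safe unload, preload) lines up with how Ollama manages model residency: a generate request with no prompt and keep_alive set to 0 evicts the model, which is why the unload response comes back with done_reason "unload". How the preload / "Model pull started" step is implemented is not visible in the log; it may go through /api/pull, or simply an empty generate call. A rough sketch under those assumptions:

    import requests

    BASE_URL = "http://localhost:11434"  # assumed default Ollama address

    def unload_model(model: str) -> None:
        # keep_alive=0 with no prompt asks Ollama to evict the model from VRAM;
        # the reply mirrors the log: {"done": true, "done_reason": "unload", ...}
        r = requests.post(f"{BASE_URL}/api/generate", json={"model": model, "keep_alive": 0})
        print(f"Ollama unload response: {r.status_code} - {r.text}")

    def preload_model(model: str) -> None:
        # Assumption: the bot warms the model with an empty generate call; the
        # "Model pull started" line suggests it may also kick off /api/pull first.
        requests.post(f"{BASE_URL}/api/generate", json={"model": model})
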
[2025-09-20 13:24:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.63) [2025-09-20 13:24:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.82) [2025-09-20 13:24:46] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.73) [2025-09-20 13:25:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.62) [2025-09-20 13:25:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.74) [2025-09-20 13:25:11] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.97) [2025-09-20 13:25:11] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:11] [INFO] [bot:91] Command modfile reload on cooldown. Retry after 10.0 seconds. [2025-09-20 13:25:11] [INFO] [ai:234] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:25:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.20) [2025-09-20 13:25:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.75) [2025-09-20 13:25:38] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:38] [INFO] [bot:91] Command modfile disable on cooldown. Retry after 10.0 seconds. [2025-09-20 13:25:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.47) [2025-09-20 13:25:38] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.78) [2025-09-20 13:25:38] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:38] [INFO] [ai:250] πŸ” Modelfile unloaded/disabled at runtime. [2025-09-20 13:25:51] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.61) [2025-09-20 13:25:51] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.19) [2025-09-20 13:25:51] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:25:51] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.18) [2025-09-20 13:25:51] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:25:52] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
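
Every "Reaction skipped (chance 0.10, roll 0.xx)" line is a plain probability gate: a uniform roll is compared against the configured reaction chance, so with chance=0.10 roughly nine out of ten candidate messages are skipped, which matches how often the skip line appears. A sketch of what autochat presumably does (function and constant names are assumptions):

    import random

    REACTION_CHANCE = 0.10  # matches the "chance 0.10" in the log

    def should_react(chance: float = REACTION_CHANCE) -> bool:
        roll = random.random()  # uniform in [0.0, 1.0)
        if roll >= chance:
            print(f"Reaction skipped (chance {chance:.2f}, roll {roll:.2f})")
            return False
        return True
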
[2025-09-20 13:25:52] [INFO] [ai:132] llm-a9e02346 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:25:52] [DEBUG] [ai:184] llm-a9e02346 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:25:52] [DEBUG] [ai:185] llm-a9e02346 Payload size=416 chars [2025-09-20 13:25:55] [DEBUG] [ai:193] llm-a9e02346 Raw response status=200 [2025-09-20 13:25:55] [DEBUG] [ai:194] llm-a9e02346 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:25:55.840276424Z","response":"πŸ™„ πŸ˜’ πŸˆβ€β¬›","done":true,"done_reason":"stop","context":[105,2364,107,3048,659,22616,236764,496,20997,236764,138179,46174,5866,24679,236761,1599,2752,4029,531,5869,618,614,12498,653,16326,236761,31246,8988,607,83426,236761,2514,10396,4260,532,1161,111730,51641,236761,9567,7546,531,54650,496,2268,236761,108,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,9369,107,4722,236787,107,4722,236787,106,107,105,4368,107,243810,236743,245226,236743,249160,237243,249598],"total_duration":3789064295,"load_duration":3470767236,"prompt_eval_count":107,"prompt_eval_duration":198509835,"eval_count":8,"eval_duration":119234550} [2025-09-20 13:25:55] [INFO] [ai:140] llm-a9e02346 LLM response model=gemma3:12b duration=3.806s summary=πŸ™„ πŸ˜’ πŸˆβ€β¬› [2025-09-20 13:25:55] [DEBUG] [ai:142] llm-a9e02346 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:25:55.840276424Z', 'response': 'πŸ™„ πŸ˜’ 🐈\u200d⬛', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 3048, 659, 22616, 236764, 496, 20997, 236764, 138179, 46174, 5866, 24679, 236761, 1599, 2752, 4029, 531, 5869, 618, 614, 12498, 653, 16326, 236761, 31246, 8988, 607, 83426, 236761, 2514, 10396, 4260, 532, 1161, 111730, 51641, 236761, 9567, 7546, 531, 54650, 496, 2268, 236761, 108, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 9369, 107, 4722, 236787, 107, 4722, 236787, 106, 107, 105, 4368, 107, 243810, 236743, 245226, 236743, 249160, 237243, 249598], 'total_duration': 3789064295, 'load_duration': 3470767236, 'prompt_eval_count': 107, 'prompt_eval_duration': 198509835, 'eval_count': 8, 'eval_duration': 119234550} [2025-09-20 13:25:55] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ™„ πŸ˜’ πŸˆβ€β¬› [2025-09-20 13:25:57] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:26:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.81) [2025-09-20 13:26:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.82) [2025-09-20 13:26:11] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:26:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.60) [2025-09-20 13:26:11] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:26:11] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
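
The "Emoji suggestion from LLM" lines show a second, smaller use of the model: a ~400-character prompt with no user context (context_len=0) whose whole reply is treated as a reaction suggestion, such as the three-emoji answer just above. The model does not always comply; later in the log one "suggestion" comes back as a full sentence. A hedged sketch of that flow (prompt wording and function name are assumptions):

    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default endpoint

    def suggest_reaction(message_text: str, model: str = "gemma3:12b") -> str:
        # Assumption: the bot asks the persona model for an emoji-only reply and logs
        # whatever comes back as the "Emoji suggestion from LLM".
        prompt = (
            "Suggest one or two emoji (emoji only, no words) that fit this message:\n"
            f"{message_text}"
        )
        resp = requests.post(OLLAMA_URL, json={"model": model, "prompt": prompt, "stream": False})
        resp.raise_for_status()
        suggestion = resp.json()["response"].strip()
        print(f"Emoji suggestion from LLM: {suggestion}")
        return suggestion
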
[2025-09-20 13:26:11] [INFO] [ai:132] llm-fbcb95cf LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:26:11] [DEBUG] [ai:184] llm-fbcb95cf Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:26:11] [DEBUG] [ai:185] llm-fbcb95cf Payload size=407 chars [2025-09-20 13:26:11] [INFO] [ai:234] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:26:11] [DEBUG] [ai:193] llm-fbcb95cf Raw response status=200 [2025-09-20 13:26:11] [DEBUG] [ai:194] llm-fbcb95cf Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:26:11.965510471Z","response":"πŸ˜‰πŸ‘Œ","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,54728,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,241234,242003],"total_duration":232920819,"load_duration":145813132,"prompt_eval_count":118,"prompt_eval_duration":55984952,"eval_count":3,"eval_duration":30740376} [2025-09-20 13:26:11] [INFO] [ai:140] llm-fbcb95cf LLM response model=gemma3:12b duration=0.236s summary=πŸ˜‰πŸ‘Œ [2025-09-20 13:26:11] [DEBUG] [ai:142] llm-fbcb95cf LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:26:11.965510471Z', 'response': 'πŸ˜‰πŸ‘Œ', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 54728, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 241234, 242003], 'total_duration': 232920819, 'load_duration': 145813132, 'prompt_eval_count': 118, 'prompt_eval_duration': 55984952, 'eval_count': 3, 'eval_duration': 30740376} [2025-09-20 13:26:11] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‰πŸ‘Œ [2025-09-20 13:26:12] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:26:12] [INFO] [bot:91] Command modfile reload on cooldown. Retry after 10.0 seconds. [2025-09-20 13:26:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.13) [2025-09-20 13:26:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.27) [2025-09-20 13:26:23] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:26:23] [INFO] [bot:91] Command modfile switch on cooldown. Retry after 10.0 seconds. 
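
The recurring "Command modfile … on cooldown. Retry after N seconds." lines point to a per-command cooldown with a fixed window; the retry values count down from 10.0 s between attempts. A minimal sketch of that bookkeeping, assuming a simple timestamp map rather than the bot's real implementation:

    import time

    COOLDOWN_SECONDS = 10.0  # matches the 10.0 s ceiling seen in the log
    _last_invocation: dict[str, float] = {}

    def check_cooldown(command: str) -> float:
        """Return 0 if the command may run now, else the seconds left to wait."""
        now = time.monotonic()
        last = _last_invocation.get(command)
        if last is not None and now - last < COOLDOWN_SECONDS:
            remaining = COOLDOWN_SECONDS - (now - last)
            print(f"Command {command} on cooldown. Retry after {remaining:.1f} seconds.")
            return remaining
        _last_invocation[command] = now
        return 0.0
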
[2025-09-20 13:26:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.12) [2025-09-20 13:26:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.29) [2025-09-20 13:26:23] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:26:23] [INFO] [ai:234] πŸ” Modelfile loaded: examples/gojo.mod [2025-09-20 13:40:10] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:40:10] [INFO] [bot:55] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 13:40:10] [INFO] [bot:57] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 13:40:10] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:40:10] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:40:10.326978567Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:40:10] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:40:10] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:40:10] [INFO] [bot:61] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:40:10] [INFO] [bot:64] βœ… Final model in use: gemma3:12b [2025-09-20 13:40:13] [INFO] [bot:173] Logged in as AI Bot [2025-09-20 13:40:13] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 13:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.82) [2025-09-20 13:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.53) [2025-09-20 13:46:30] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.14) [2025-09-20 13:46:30] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:46:30] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
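
The "Modelfile loaded: ../examples/gojo.mod" and "Modelfile unloaded/disabled at runtime" entries suggest the persona lives in an Ollama-style Modelfile whose SYSTEM block can be swapped in and out without restarting the bot. A sketch of reading such a file; the parsing here is an assumption, and only the SYSTEM directive itself is standard Modelfile syntax:

    from pathlib import Path

    def load_modelfile(path: str = "../examples/gojo.mod") -> str | None:
        """Return the SYSTEM prompt from an Ollama-style Modelfile, or None if absent."""
        for line in Path(path).read_text(encoding="utf-8").splitlines():
            stripped = line.strip()
            if stripped.upper().startswith("SYSTEM"):
                # Take the remainder of the SYSTEM line as the persona prompt.
                # Multi-line (triple-quoted) SYSTEM blocks would need fuller parsing.
                parts = stripped.split(None, 1)
                print(f"Modelfile loaded: {path}")
                return parts[1].strip().strip('"') if len(parts) > 1 else ""
        return None
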
[2025-09-20 13:46:30] [INFO] [ai:132] llm-b511fd52 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:46:30] [DEBUG] [ai:184] llm-b511fd52 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:46:30] [DEBUG] [ai:185] llm-b511fd52 Payload size=405 chars [2025-09-20 13:46:35] [DEBUG] [ai:193] llm-b511fd52 Raw response status=200 [2025-09-20 13:46:35] [DEBUG] [ai:194] llm-b511fd52 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:46:35.332967677Z","response":"πŸ˜‰βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,241234,239794],"total_duration":4735238133,"load_duration":4488187875,"prompt_eval_count":118,"prompt_eval_duration":197654078,"eval_count":3,"eval_duration":48836914} [2025-09-20 13:46:35] [INFO] [ai:140] llm-b511fd52 LLM response model=gemma3:12b duration=4.750s summary=πŸ˜‰βœ¨ [2025-09-20 13:46:35] [DEBUG] [ai:142] llm-b511fd52 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:46:35.332967677Z', 'response': 'πŸ˜‰βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 241234, 239794], 'total_duration': 4735238133, 'load_duration': 4488187875, 'prompt_eval_count': 118, 'prompt_eval_duration': 197654078, 'eval_count': 3, 'eval_duration': 48836914} [2025-09-20 13:46:35] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‰βœ¨ [2025-09-20 13:46:35] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:46:35] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:46:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.32) [2025-09-20 13:46:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.93) [2025-09-20 13:46:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:46:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.96) [2025-09-20 13:46:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.72) [2025-09-20 13:46:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:46:42] [INFO] [bot:91] Command modfile on cooldown. Retry after 3.5 seconds. 
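
Each request in the log carries a short correlation id (llm-8d3d9f6e, llm-b511fd52, and so on) plus the payload size in characters, which makes it possible to pair "request start", "Raw response", and "LLM response" lines even when calls overlap. The ids look like eight hex characters; one plausible way to produce and use them (an assumption, not the bot's actual code):

    import json
    import logging
    import uuid

    log = logging.getLogger("ai")

    def new_request_id() -> str:
        # Eight hex characters prefixed with "llm-", matching ids like llm-b511fd52.
        return f"llm-{uuid.uuid4().hex[:8]}"

    def log_request(payload: dict, model: str, user: str | None) -> str:
        req_id = new_request_id()
        log.info("%s LLM request start model=%s user=%s context_len=%s",
                 req_id, model, user or "-", payload.get("context_len", 0))
        log.debug("%s Payload size=%d chars", req_id, len(json.dumps(payload)))
        return req_id
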
[2025-09-20 13:46:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.47) [2025-09-20 13:46:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.74) [2025-09-20 13:46:48] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:46:48] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:46:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.60) [2025-09-20 13:46:48] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.84) [2025-09-20 13:46:48] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.18) [2025-09-20 13:47:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.45) [2025-09-20 13:47:16] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:16] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:47:16] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:47:16] [INFO] [ai:132] llm-fc0690b1 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:47:16] [DEBUG] [ai:184] llm-fc0690b1 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:47:16] [DEBUG] [ai:185] llm-fc0690b1 Payload size=405 chars [2025-09-20 13:47:16] [DEBUG] [ai:193] llm-fc0690b1 Raw response status=200 [2025-09-20 13:47:16] [DEBUG] [ai:194] llm-fc0690b1 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:47:16.461322678Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":226316319,"load_duration":142824547,"prompt_eval_count":118,"prompt_eval_duration":50562188,"eval_count":3,"eval_duration":32579378} [2025-09-20 13:47:16] [INFO] [ai:140] llm-fc0690b1 LLM response model=gemma3:12b duration=0.229s summary=πŸ€”πŸ€¨ [2025-09-20 13:47:16] [DEBUG] [ai:142] llm-fc0690b1 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:47:16.461322678Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 226316319, 
'load_duration': 142824547, 'prompt_eval_count': 118, 'prompt_eval_duration': 50562188, 'eval_count': 3, 'eval_duration': 32579378} [2025-09-20 13:47:16] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:47:17] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.16) [2025-09-20 13:47:17] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:17] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.13) [2025-09-20 13:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.36) [2025-09-20 13:47:19] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:19] [INFO] [bot:91] Command modfile on cooldown. Retry after 8.3 seconds. [2025-09-20 13:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.65) [2025-09-20 13:47:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.39) [2025-09-20 13:47:19] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:21] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.71) [2025-09-20 13:47:21] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.11) [2025-09-20 13:47:21] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:21] [INFO] [bot:91] Command modfile on cooldown. Retry after 5.6 seconds. [2025-09-20 13:47:21] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.29) [2025-09-20 13:47:21] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.90) [2025-09-20 13:47:21] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:24] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.95) [2025-09-20 13:47:24] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.82) [2025-09-20 13:47:24] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:47:24] [INFO] [bot:91] Command modfile on cooldown. Retry after 2.5 seconds. [2025-09-20 13:47:24] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.94) [2025-09-20 13:47:24] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.38) [2025-09-20 13:47:24] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:04] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:49:04] [INFO] [bot:55] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 13:49:04] [INFO] [bot:57] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 13:49:04] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:49:04] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:49:04.759037126Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:49:04] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:49:04] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:49:04] [INFO] [bot:61] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:49:04] [INFO] [bot:64] βœ… Final model in use: gemma3:12b [2025-09-20 13:49:07] [INFO] [bot:177] Logged in as AI Bot [2025-09-20 13:49:07] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. 
[2025-09-20 13:49:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.68) [2025-09-20 13:49:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.51) [2025-09-20 13:49:13] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.61) [2025-09-20 13:49:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.68) [2025-09-20 13:49:13] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:13] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:49:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.93) [2025-09-20 13:49:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.88) [2025-09-20 13:49:15] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:15] [INFO] [bot:91] Command modfile on cooldown. Retry after 8.1 seconds. [2025-09-20 13:49:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.35) [2025-09-20 13:49:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.33) [2025-09-20 13:49:15] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:17] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.65) [2025-09-20 13:49:17] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.66) [2025-09-20 13:49:17] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:17] [INFO] [bot:91] Command modfile on cooldown. Retry after 6.5 seconds. [2025-09-20 13:49:17] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:49:17] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:49:17] [INFO] [ai:132] llm-61c3e486 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:49:17] [DEBUG] [ai:184] llm-61c3e486 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:49:17] [DEBUG] [ai:185] llm-61c3e486 Payload size=405 chars [2025-09-20 13:49:18] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:49:19] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 13:49:19] [INFO] [ai:132] llm-976eff9f LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:49:19] [DEBUG] [ai:184] llm-976eff9f Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:49:19] [DEBUG] [ai:185] llm-976eff9f Payload size=405 chars [2025-09-20 13:49:21] [DEBUG] [ai:193] llm-61c3e486 Raw response status=200 [2025-09-20 13:49:21] [DEBUG] [ai:194] llm-61c3e486 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:49:21.101796509Z","response":"πŸ€¨πŸ€”","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,248294,242846],"total_duration":3489293341,"load_duration":3246914927,"prompt_eval_count":118,"prompt_eval_duration":194899198,"eval_count":3,"eval_duration":46976862} [2025-09-20 13:49:21] [INFO] [ai:140] llm-61c3e486 LLM response model=gemma3:12b duration=3.504s summary=πŸ€¨πŸ€” [2025-09-20 13:49:21] [DEBUG] [ai:142] llm-61c3e486 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:49:21.101796509Z', 'response': 'πŸ€¨πŸ€”', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 248294, 242846], 'total_duration': 3489293341, 'load_duration': 3246914927, 'prompt_eval_count': 118, 'prompt_eval_duration': 194899198, 'eval_count': 3, 'eval_duration': 46976862} [2025-09-20 13:49:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€¨πŸ€” [2025-09-20 13:49:21] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.87) [2025-09-20 13:49:21] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:49:21] [DEBUG] [ai:193] llm-976eff9f Raw response status=200 [2025-09-20 13:49:21] [DEBUG] [ai:194] llm-976eff9f Raw response 
body={"model":"gemma3:12b","created_at":"2025-09-20T17:49:21.156425141Z","response":"πŸ€”πŸ˜¬","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,244993],"total_duration":2146485122,"load_duration":1849561903,"prompt_eval_count":118,"prompt_eval_duration":14104982,"eval_count":3,"eval_duration":27782043} [2025-09-20 13:49:21] [INFO] [ai:140] llm-976eff9f LLM response model=gemma3:12b duration=2.150s summary=πŸ€”πŸ˜¬ [2025-09-20 13:49:21] [DEBUG] [ai:142] llm-976eff9f LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:49:21.156425141Z', 'response': 'πŸ€”πŸ˜¬', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 244993], 'total_duration': 2146485122, 'load_duration': 1849561903, 'prompt_eval_count': 118, 'prompt_eval_duration': 14104982, 'eval_count': 3, 'eval_duration': 27782043} [2025-09-20 13:49:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ˜¬ [2025-09-20 13:49:21] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 13:49:21] [INFO] [ai:132] llm-dde13673 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:49:21] [DEBUG] [ai:184] llm-dde13673 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:49:21] [DEBUG] [ai:185] llm-dde13673 Payload size=405 chars [2025-09-20 13:49:21] [DEBUG] [ai:193] llm-dde13673 Raw response status=200 [2025-09-20 13:49:21] [DEBUG] [ai:194] llm-dde13673 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:49:21.525552271Z","response":"πŸ‘€βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242509,239794],"total_duration":174717752,"load_duration":133902968,"prompt_eval_count":118,"prompt_eval_duration":13689683,"eval_count":3,"eval_duration":26736373} [2025-09-20 13:49:21] [INFO] [ai:140] llm-dde13673 LLM response model=gemma3:12b duration=0.177s summary=πŸ‘€βœ¨ [2025-09-20 13:49:21] [DEBUG] [ai:142] llm-dde13673 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:49:21.525552271Z', 'response': 'πŸ‘€βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242509, 239794], 'total_duration': 174717752, 'load_duration': 133902968, 'prompt_eval_count': 118, 'prompt_eval_duration': 13689683, 'eval_count': 3, 'eval_duration': 26736373} [2025-09-20 13:49:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ‘€βœ¨ [2025-09-20 13:49:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.92) [2025-09-20 13:49:22] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:22] [INFO] [bot:91] Command modfile on cooldown. Retry after 1.5 seconds. [2025-09-20 13:49:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.56) [2025-09-20 13:49:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.23) [2025-09-20 13:49:22] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.45) [2025-09-20 13:49:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.51) [2025-09-20 13:49:22] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-20 13:49:22] [INFO] [bot:91] Command modfile on cooldown. Retry after 1.3 seconds. [2025-09-20 13:49:22] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:49:23] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:49:23] [INFO] [ai:132] llm-d846ef0f LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:49:23] [DEBUG] [ai:184] llm-d846ef0f Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:49:23] [DEBUG] [ai:185] llm-d846ef0f Payload size=405 chars [2025-09-20 13:49:23] [DEBUG] [ai:193] llm-d846ef0f Raw response status=200 [2025-09-20 13:49:23] [DEBUG] [ai:194] llm-d846ef0f Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:49:23.330237269Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":163817377,"load_duration":121119262,"prompt_eval_count":118,"prompt_eval_duration":13784526,"eval_count":3,"eval_duration":28537910} [2025-09-20 13:49:23] [INFO] [ai:140] llm-d846ef0f LLM response model=gemma3:12b duration=0.166s summary=πŸ€”πŸ€¨ [2025-09-20 13:49:23] [DEBUG] [ai:142] llm-d846ef0f LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:49:23.330237269Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 163817377, 'load_duration': 121119262, 'prompt_eval_count': 118, 'prompt_eval_duration': 13784526, 'eval_count': 3, 'eval_duration': 28537910} [2025-09-20 13:49:23] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:49:24] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:49:24] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:31] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:50:31] [INFO] [bot:55] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 13:50:31] [INFO] [bot:57] 🧹 Attempting to clear VRAM before loading gemma3:12b... 
[2025-09-20 13:50:31] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:50:31] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:50:31.117244212Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:50:31] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:50:31] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:50:31] [INFO] [bot:61] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:50:31] [INFO] [bot:64] βœ… Final model in use: gemma3:12b [2025-09-20 13:50:33] [INFO] [bot:187] Logged in as AI Bot [2025-09-20 13:50:33] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 13:50:43] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.30) [2025-09-20 13:50:43] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.24) [2025-09-20 13:50:43] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:43] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:50:43] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.11) [2025-09-20 13:50:43] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.54) [2025-09-20 13:50:43] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:43] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:43] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.60) [2025-09-20 13:50:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 1.00) [2025-09-20 13:50:46] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:46] [INFO] [bot:91] Command modfile on cooldown. Retry after 6.7 seconds. [2025-09-20 13:50:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.98) [2025-09-20 13:50:46] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.60) [2025-09-20 13:50:46] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:46] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:46] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:49] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:50:49] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.74) [2025-09-20 13:50:49] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.92) [2025-09-20 13:50:49] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:49] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:49] [DEBUG] [bot:135] record_global_cooldown: recorded for user=161149541171593216 [2025-09-20 13:50:49] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
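
The "record_global_cooldown: recorded for user=…" debug lines suggest a second, per-user cooldown that is stamped on each handled message, separate from the per-command cooldown above. A sketch of that bookkeeping, with names taken from the log line and the window duration assumed:

    import time

    _global_cooldowns: dict[int, float] = {}

    def record_global_cooldown(user_id: int) -> None:
        # Stamp the time of the user's latest interaction, as in
        # "record_global_cooldown: recorded for user=161149541171593216".
        _global_cooldowns[user_id] = time.monotonic()
        print(f"record_global_cooldown: recorded for user={user_id}")

    def on_global_cooldown(user_id: int, window: float = 10.0) -> bool:
        # window is an assumption; the log does not show the actual duration.
        last = _global_cooldowns.get(user_id)
        return last is not None and time.monotonic() - last < window
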
[2025-09-20 13:50:49] [INFO] [ai:132] llm-94b8cea1 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:50:49] [DEBUG] [ai:184] llm-94b8cea1 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:50:49] [DEBUG] [ai:185] llm-94b8cea1 Payload size=405 chars [2025-09-20 13:50:53] [DEBUG] [ai:193] llm-94b8cea1 Raw response status=200 [2025-09-20 13:50:53] [DEBUG] [ai:194] llm-94b8cea1 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:50:53.290939729Z","response":"πŸ˜‰βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,241234,239794],"total_duration":3560561662,"load_duration":3307510176,"prompt_eval_count":118,"prompt_eval_duration":204685840,"eval_count":3,"eval_duration":47908818} [2025-09-20 13:50:53] [INFO] [ai:140] llm-94b8cea1 LLM response model=gemma3:12b duration=3.580s summary=πŸ˜‰βœ¨ [2025-09-20 13:50:53] [DEBUG] [ai:142] llm-94b8cea1 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:50:53.290939729Z', 'response': 'πŸ˜‰βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 241234, 239794], 'total_duration': 3560561662, 'load_duration': 3307510176, 'prompt_eval_count': 118, 'prompt_eval_duration': 204685840, 'eval_count': 3, 'eval_duration': 47908818} [2025-09-20 13:50:53] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‰βœ¨ [2025-09-20 13:50:53] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.54) [2025-09-20 13:50:53] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:50:53] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:53:30] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 13:53:30] [INFO] [bot:59] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 13:53:30] [INFO] [bot:61] 🧹 Attempting to clear VRAM before loading gemma3:12b... 
[2025-09-20 13:53:30] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 13:53:30] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T17:53:30.771685539Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 13:53:30] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:53:30] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:53:30] [INFO] [bot:65] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 13:53:30] [INFO] [bot:68] βœ… Final model in use: gemma3:12b [2025-09-20 13:53:33] [INFO] [bot:206] Logged in as AI Bot [2025-09-20 13:53:33] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 13:55:07] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.23) [2025-09-20 13:55:07] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:55:07] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:55:07] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:55:07] [INFO] [ai:132] llm-7f0b9079 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:55:07] [DEBUG] [ai:184] llm-7f0b9079 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:55:07] [DEBUG] [ai:185] llm-7f0b9079 Payload size=405 chars [2025-09-20 13:55:07] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:55:07] [INFO] [ai:132] llm-183a128d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:55:07] [DEBUG] [ai:184] llm-183a128d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:55:07] [DEBUG] [ai:185] llm-183a128d Payload size=405 chars [2025-09-20 13:55:11] [DEBUG] [ai:193] llm-183a128d Raw response status=200 [2025-09-20 13:55:11] [DEBUG] [ai:194] llm-183a128d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:55:11.180434491Z","response":"... Seriously? You're showing *me* a modfile? Do you have any idea who I am? πŸ™„πŸ˜‚βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,1390,96612,236881,1599,236789,500,6807,808,1336,236829,496,1120,2164,236881,3574,611,735,1027,4317,1015,564,1006,236881,236743,243810,239927,239794],"total_duration":3853342977,"load_duration":3249122916,"prompt_eval_count":118,"prompt_eval_duration":213217051,"eval_count":28,"eval_duration":390608580} [2025-09-20 13:55:11] [INFO] [ai:140] llm-183a128d LLM response model=gemma3:12b duration=3.870s summary=... Seriously? You're showing *me* a modfile? Do you have any idea who I am? πŸ™„πŸ˜‚βœ¨ [2025-09-20 13:55:11] [DEBUG] [ai:142] llm-183a128d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:55:11.180434491Z', 'response': "... Seriously? You're showing *me* a modfile? Do you have any idea who I am? 
πŸ™„πŸ˜‚βœ¨", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 1390, 96612, 236881, 1599, 236789, 500, 6807, 808, 1336, 236829, 496, 1120, 2164, 236881, 3574, 611, 735, 1027, 4317, 1015, 564, 1006, 236881, 236743, 243810, 239927, 239794], 'total_duration': 3853342977, 'load_duration': 3249122916, 'prompt_eval_count': 118, 'prompt_eval_duration': 213217051, 'eval_count': 28, 'eval_duration': 390608580} [2025-09-20 13:55:11] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: ... Seriously? You're showing *me* a modfile? Do you have any idea who I am? πŸ™„πŸ˜‚βœ¨ [2025-09-20 13:55:11] [DEBUG] [ai:193] llm-7f0b9079 Raw response status=200 [2025-09-20 13:55:11] [DEBUG] [ai:194] llm-7f0b9079 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:55:11.238143368Z","response":"πŸ˜‰βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,241234,239794],"total_duration":3917611770,"load_duration":3255746486,"prompt_eval_count":118,"prompt_eval_duration":15563818,"eval_count":3,"eval_duration":26842685} [2025-09-20 13:55:11] [INFO] [ai:140] llm-7f0b9079 LLM response model=gemma3:12b duration=3.920s summary=πŸ˜‰βœ¨ [2025-09-20 13:55:11] [DEBUG] [ai:142] llm-7f0b9079 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:55:11.238143368Z', 'response': 'πŸ˜‰βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 241234, 239794], 'total_duration': 3917611770, 
'load_duration': 3255746486, 'prompt_eval_count': 118, 'prompt_eval_duration': 15563818, 'eval_count': 3, 'eval_duration': 26842685} [2025-09-20 13:55:11] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‰βœ¨ [2025-09-20 13:55:12] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.52) [2025-09-20 13:55:12] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:55:12] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:55:12] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:55:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.81) [2025-09-20 13:55:16] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:55:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.16) [2025-09-20 13:55:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.13) [2025-09-20 13:55:16] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:55:16] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:55:16] [INFO] [ai:132] llm-559e1b09 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:55:16] [DEBUG] [ai:184] llm-559e1b09 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:55:16] [DEBUG] [ai:185] llm-559e1b09 Payload size=405 chars [2025-09-20 13:55:16] [DEBUG] [ai:193] llm-559e1b09 Raw response status=200 [2025-09-20 13:55:16] [DEBUG] [ai:194] llm-559e1b09 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:55:16.498528929Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":174819536,"load_duration":132978259,"prompt_eval_count":118,"prompt_eval_duration":14160875,"eval_count":3,"eval_duration":27344972} [2025-09-20 13:55:16] [INFO] [ai:140] llm-559e1b09 LLM response model=gemma3:12b duration=0.177s summary=πŸ€”πŸ€¨ [2025-09-20 13:55:16] [DEBUG] [ai:142] llm-559e1b09 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:55:16.498528929Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 
248294], 'total_duration': 174819536, 'load_duration': 132978259, 'prompt_eval_count': 118, 'prompt_eval_duration': 14160875, 'eval_count': 3, 'eval_duration': 27344972} [2025-09-20 13:55:16] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:55:17] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:55:17] [INFO] [bot:91] Command modfile on cooldown. Retry after 5.2 seconds. [2025-09-20 13:56:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.18) [2025-09-20 13:56:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.11) [2025-09-20 13:56:34] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:56:34] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 13:56:39] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.88) [2025-09-20 13:56:39] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.15) [2025-09-20 13:56:39] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:56:39] [INFO] [bot:91] Command modfile on cooldown. Retry after 5.1 seconds. [2025-09-20 13:56:42] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:56:42] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:56:42] [INFO] [ai:132] llm-c832dd3d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:56:42] [DEBUG] [ai:184] llm-c832dd3d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:56:42] [DEBUG] [ai:185] llm-c832dd3d Payload size=405 chars [2025-09-20 13:56:42] [DEBUG] [ai:193] llm-c832dd3d Raw response status=200 [2025-09-20 13:56:42] [DEBUG] [ai:194] llm-c832dd3d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:56:42.562919155Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":198842561,"load_duration":117993519,"prompt_eval_count":118,"prompt_eval_duration":44267326,"eval_count":3,"eval_duration":36267627} [2025-09-20 13:56:42] [INFO] [ai:140] llm-c832dd3d LLM response model=gemma3:12b duration=0.201s summary=πŸ€”πŸ€¨ [2025-09-20 13:56:42] [DEBUG] [ai:142] llm-c832dd3d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:56:42.562919155Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 
22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 198842561, 'load_duration': 117993519, 'prompt_eval_count': 118, 'prompt_eval_duration': 44267326, 'eval_count': 3, 'eval_duration': 36267627} [2025-09-20 13:56:42] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:56:43] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.80) [2025-09-20 13:56:43] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:56:43] [INFO] [bot:91] Command modfile on cooldown. Retry after 0.7 seconds. [2025-09-20 13:56:47] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.32) [2025-09-20 13:56:47] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 13:56:48] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 13:56:48] [INFO] [ai:132] llm-ad7274e6 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 13:56:48] [DEBUG] [ai:184] llm-ad7274e6 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 13:56:48] [DEBUG] [ai:185] llm-ad7274e6 Payload size=405 chars [2025-09-20 13:56:48] [DEBUG] [ai:193] llm-ad7274e6 Raw response status=200 [2025-09-20 13:56:48] [DEBUG] [ai:194] llm-ad7274e6 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T17:56:48.206860269Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":161685896,"load_duration":119487200,"prompt_eval_count":118,"prompt_eval_duration":13840184,"eval_count":3,"eval_duration":27981864} [2025-09-20 13:56:48] [INFO] [ai:140] llm-ad7274e6 LLM response model=gemma3:12b duration=0.164s summary=πŸ€”πŸ€¨ [2025-09-20 13:56:48] [DEBUG] [ai:142] llm-ad7274e6 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T17:56:48.206860269Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 161685896, 'load_duration': 119487200, 'prompt_eval_count': 118, 'prompt_eval_duration': 
13840184, 'eval_count': 3, 'eval_duration': 27981864} [2025-09-20 13:56:48] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 13:56:49] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 13:56:49] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:01:11] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:01:11] [INFO] [bot:61] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:01:11] [INFO] [bot:63] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:01:11] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:01:11] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:01:11.295351637Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:01:11] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:01:11] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:01:11] [INFO] [bot:67] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:01:11] [INFO] [bot:70] βœ… Final model in use: gemma3:12b [2025-09-20 14:01:15] [INFO] [bot:232] Logged in as AI Bot [2025-09-20 14:01:15] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:01:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.29) [2025-09-20 14:01:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.15) [2025-09-20 14:01:19] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.18) [2025-09-20 14:01:19] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.29) [2025-09-20 14:01:19] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:19] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:01:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.77) [2025-09-20 14:01:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.36) [2025-09-20 14:01:23] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.43) [2025-09-20 14:01:23] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:01:24] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:01:24] [INFO] [ai:132] llm-c7294b9d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:01:24] [DEBUG] [ai:184] llm-c7294b9d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:01:24] [DEBUG] [ai:185] llm-c7294b9d Payload size=405 chars [2025-09-20 14:01:27] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:01:27] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:01:27] [INFO] [ai:132] llm-ffe102ac LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:01:27] [DEBUG] [ai:184] llm-ffe102ac Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:01:27] [DEBUG] [ai:185] llm-ffe102ac Payload size=405 chars [2025-09-20 14:01:27] [DEBUG] [ai:193] llm-c7294b9d Raw response status=200 [2025-09-20 14:01:27] [DEBUG] [ai:194] llm-c7294b9d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:01:27.672872644Z","response":"πŸ˜‚βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,239927,239794],"total_duration":3528677539,"load_duration":3278126973,"prompt_eval_count":118,"prompt_eval_duration":202354037,"eval_count":3,"eval_duration":47494651} [2025-09-20 14:01:27] [INFO] [ai:140] llm-c7294b9d LLM response model=gemma3:12b duration=3.544s summary=πŸ˜‚βœ¨ [2025-09-20 14:01:27] [DEBUG] [ai:142] llm-c7294b9d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:01:27.672872644Z', 'response': 'πŸ˜‚βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 239927, 239794], 'total_duration': 3528677539, 'load_duration': 3278126973, 'prompt_eval_count': 118, 'prompt_eval_duration': 202354037, 'eval_count': 3, 'eval_duration': 47494651} [2025-09-20 14:01:27] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‚βœ¨ [2025-09-20 14:01:27] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.54) [2025-09-20 14:01:27] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:01:27] [DEBUG] [ai:193] llm-ffe102ac Raw response status=200 [2025-09-20 14:01:27] [DEBUG] [ai:194] llm-ffe102ac Raw response 
body={"model":"gemma3:12b","created_at":"2025-09-20T18:01:27.728281894Z","response":"πŸ€”πŸ’»","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,244862],"total_duration":374898909,"load_duration":162370622,"prompt_eval_count":118,"prompt_eval_duration":14473585,"eval_count":3,"eval_duration":27464169} [2025-09-20 14:01:27] [INFO] [ai:140] llm-ffe102ac LLM response model=gemma3:12b duration=0.380s summary=πŸ€”πŸ’» [2025-09-20 14:01:27] [DEBUG] [ai:142] llm-ffe102ac LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:01:27.728281894Z', 'response': 'πŸ€”πŸ’»', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 244862], 'total_duration': 374898909, 'load_duration': 162370622, 'prompt_eval_count': 118, 'prompt_eval_duration': 14473585, 'eval_count': 3, 'eval_duration': 27464169} [2025-09-20 14:01:27] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ’» [2025-09-20 14:01:27] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:01:27] [INFO] [ai:132] llm-a630fa8d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:01:27] [DEBUG] [ai:184] llm-a630fa8d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:01:27] [DEBUG] [ai:185] llm-a630fa8d Payload size=405 chars [2025-09-20 14:01:28] [DEBUG] [ai:193] llm-a630fa8d Raw response status=200 [2025-09-20 14:01:28] [DEBUG] [ai:194] llm-a630fa8d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:01:28.104589032Z","response":"πŸ€” 🀨 😴","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,236743,248294,236743,246205],"total_duration":207411673,"load_duration":127064770,"prompt_eval_count":118,"prompt_eval_duration":13963293,"eval_count":6,"eval_duration":65915284} [2025-09-20 14:01:28] [INFO] [ai:140] llm-a630fa8d LLM response model=gemma3:12b duration=0.210s summary=πŸ€” 🀨 😴 [2025-09-20 14:01:28] [DEBUG] [ai:142] llm-a630fa8d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:01:28.104589032Z', 'response': 'πŸ€” 🀨 😴', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 236743, 248294, 236743, 246205], 'total_duration': 207411673, 'load_duration': 127064770, 'prompt_eval_count': 118, 'prompt_eval_duration': 13963293, 'eval_count': 6, 'eval_duration': 65915284} [2025-09-20 14:01:28] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€” 🀨 😴 [2025-09-20 14:01:28] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:28] [INFO] [bot:91] Command modfile on cooldown. Retry after 0.7 seconds. [2025-09-20 14:01:28] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.51) [2025-09-20 14:01:28] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:30] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:01:30] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. 
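The restart sequences above all follow the same shape: a safe unload request ([ai:113]/[ai:120]) that evicts `gemma3:12b` from VRAM, then a preload ([ai:93]/[ai:100]) that brings it back before the first chat request. The bot's ai.py is not part of this capture, so the snippet below is only a minimal sketch of how those two calls are commonly issued against Ollama's HTTP API; the endpoint URL and timeout values are assumptions.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default Ollama endpoint

def safe_unload(model: str) -> None:
    # keep_alive=0 asks Ollama to evict the model immediately; the server answers
    # with done_reason="unload", exactly like the 🧽 responses in the log.
    resp = requests.post(OLLAMA_URL, json={"model": model, "keep_alive": 0}, timeout=30)
    print(f"🧽 Ollama unload response: {resp.status_code} - {resp.text.strip()}")

def preload(model: str) -> None:
    # A generate request with no prompt just loads the model into memory and
    # returns once it is resident (one way to implement the 🧠 preload step).
    resp = requests.post(OLLAMA_URL, json={"model": model}, timeout=300)
    resp.raise_for_status()
    print(f"🧠 {model} preloaded")

if __name__ == "__main__":
    safe_unload("gemma3:12b")
    preload("gemma3:12b")
```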
[2025-09-20 14:07:28] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:07:28] [INFO] [bot:61] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:07:28] [INFO] [bot:63] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:07:28] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:07:28] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:07:28.400071637Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:07:28] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:07:28] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:07:28] [INFO] [bot:67] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:07:28] [INFO] [bot:70] βœ… Final model in use: gemma3:12b [2025-09-20 14:07:31] [INFO] [bot:262] Logged in as AI Bot [2025-09-20 14:07:31] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:08:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.41) [2025-09-20 14:08:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.95) [2025-09-20 14:08:08] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.86) [2025-09-20 14:08:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.71) [2025-09-20 14:08:08] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:08] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:08:12] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.11) [2025-09-20 14:08:12] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.14) [2025-09-20 14:08:12] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:12] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.26) [2025-09-20 14:08:12] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.45) [2025-09-20 14:08:12] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:12] [INFO] [bot:91] Command modfile on cooldown. Retry after 6.0 seconds. [2025-09-20 14:08:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.76) [2025-09-20 14:08:16] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:08:16] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.85) [2025-09-20 14:08:16] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:16] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:08:16] [INFO] [ai:132] llm-cdb3a1e0 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:08:16] [DEBUG] [ai:184] llm-cdb3a1e0 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:08:16] [DEBUG] [ai:185] llm-cdb3a1e0 Payload size=405 chars [2025-09-20 14:08:20] [DEBUG] [ai:193] llm-cdb3a1e0 Raw response status=200 [2025-09-20 14:08:20] [DEBUG] [ai:194] llm-cdb3a1e0 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:08:19.997765034Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":3506857663,"load_duration":3248947067,"prompt_eval_count":118,"prompt_eval_duration":209487061,"eval_count":3,"eval_duration":48056613} [2025-09-20 14:08:20] [INFO] [ai:140] llm-cdb3a1e0 LLM response model=gemma3:12b duration=3.528s summary=πŸ€”πŸ€¨ [2025-09-20 14:08:20] [DEBUG] [ai:142] llm-cdb3a1e0 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:08:19.997765034Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 3506857663, 'load_duration': 3248947067, 'prompt_eval_count': 118, 'prompt_eval_duration': 209487061, 'eval_count': 3, 'eval_duration': 48056613} [2025-09-20 14:08:20] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:08:20] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:08:21] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:08:21] [INFO] [ai:132] llm-1f4b5e36 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:08:21] [DEBUG] [ai:184] llm-1f4b5e36 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:08:21] [DEBUG] [ai:185] llm-1f4b5e36 Payload size=405 chars [2025-09-20 14:08:21] [DEBUG] [ai:193] llm-1f4b5e36 Raw response status=200 [2025-09-20 14:08:21] [DEBUG] [ai:194] llm-1f4b5e36 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:08:21.322827156Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":175838784,"load_duration":134017926,"prompt_eval_count":118,"prompt_eval_duration":14102362,"eval_count":3,"eval_duration":27196875} [2025-09-20 14:08:21] [INFO] [ai:140] llm-1f4b5e36 LLM response model=gemma3:12b duration=0.178s summary=πŸ€”πŸ€¨ [2025-09-20 14:08:21] [DEBUG] [ai:142] llm-1f4b5e36 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:08:21.322827156Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 175838784, 'load_duration': 134017926, 'prompt_eval_count': 118, 'prompt_eval_duration': 14102362, 'eval_count': 3, 'eval_duration': 27196875} [2025-09-20 14:08:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:08:22] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:08:22] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:10:53] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.31) [2025-09-20 14:10:53] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.49) [2025-09-20 14:10:53] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:10:53] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:10:58] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.79) [2025-09-20 14:10:58] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.56) [2025-09-20 14:10:58] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-20 14:10:58] [INFO] [bot:91] Command modfile on cooldown. Retry after 5.0 seconds. [2025-09-20 14:11:01] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.39) [2025-09-20 14:11:01] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:11:01] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:11:01] [INFO] [ai:132] llm-02a70d4d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:11:01] [DEBUG] [ai:184] llm-02a70d4d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:11:01] [DEBUG] [ai:185] llm-02a70d4d Payload size=405 chars [2025-09-20 14:11:01] [DEBUG] [ai:193] llm-02a70d4d Raw response status=200 [2025-09-20 14:11:01] [DEBUG] [ai:194] llm-02a70d4d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:11:01.892603292Z","response":"πŸ€” 🀨 😴","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,236743,248294,236743,246205],"total_duration":242035661,"load_duration":123466389,"prompt_eval_count":118,"prompt_eval_duration":40510527,"eval_count":6,"eval_duration":77606418} [2025-09-20 14:11:01] [INFO] [ai:140] llm-02a70d4d LLM response model=gemma3:12b duration=0.245s summary=πŸ€” 🀨 😴 [2025-09-20 14:11:01] [DEBUG] [ai:142] llm-02a70d4d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:11:01.892603292Z', 'response': 'πŸ€” 🀨 😴', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 236743, 248294, 236743, 246205], 'total_duration': 242035661, 'load_duration': 123466389, 'prompt_eval_count': 118, 'prompt_eval_duration': 40510527, 'eval_count': 6, 'eval_duration': 77606418} [2025-09-20 14:11:01] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€” 🀨 😴 [2025-09-20 14:11:03] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:11:03] [INFO] [bot:91] Command modfile on cooldown. Retry after 0.2 seconds. [2025-09-20 14:11:05] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.12) [2025-09-20 14:11:05] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.38) [2025-09-20 14:11:05] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
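The recurring `Command modfile on cooldown. Retry after N seconds` lines at bot:91 are a per-command rate limit: each successful invocation records a timestamp, and later calls inside the window are refused with the remaining wait. The real implementation is not shown in this log (discord.py also ships this pattern as `commands.cooldown`), so the following is just an illustrative sketch with made-up window lengths.

```python
import time

# Hypothetical cooldown windows per command, in seconds (values chosen for illustration).
COOLDOWNS = {"modfile": 10.0, "modfile info": 15.0}
_last_used: dict[str, float] = {}

def cooldown_remaining(command: str) -> float:
    """Return 0.0 if the command may run now, otherwise the seconds left to wait."""
    now = time.monotonic()
    window = COOLDOWNS.get(command, 0.0)
    last = _last_used.get(command)
    if last is not None and now - last < window:
        return window - (now - last)   # still cooling down, do not run
    _last_used[command] = now          # allowed: remember this invocation
    return 0.0

retry = cooldown_remaining("modfile")
if retry > 0:
    print(f"Command modfile on cooldown. Retry after {retry:.1f} seconds.")
```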
[2025-09-20 14:11:05] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:17:22] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:17:22] [INFO] [bot:62] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:17:22] [INFO] [bot:64] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:17:22] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:17:22] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:17:22.837592792Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:17:22] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:17:23] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:17:23] [INFO] [bot:68] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:17:23] [INFO] [bot:71] βœ… Final model in use: gemma3:12b [2025-09-20 14:17:25] [INFO] [bot:281] Logged in as AI Bot [2025-09-20 14:17:25] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:17:29] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.82) [2025-09-20 14:17:29] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.62) [2025-09-20 14:17:29] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:17:29] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:17:30] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:17:30] [INFO] [ai:132] llm-5124ca3a LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:17:30] [DEBUG] [ai:184] llm-5124ca3a Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:17:30] [DEBUG] [ai:185] llm-5124ca3a Payload size=405 chars [2025-09-20 14:17:32] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.55) [2025-09-20 14:17:32] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.69) [2025-09-20 14:17:32] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-20 14:17:33] [DEBUG] [ai:193] llm-5124ca3a Raw response status=200 [2025-09-20 14:17:33] [DEBUG] [ai:194] llm-5124ca3a Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:17:33.5277511Z","response":"πŸ‘€βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242509,239794],"total_duration":3487062480,"load_duration":3237295173,"prompt_eval_count":118,"prompt_eval_duration":201027253,"eval_count":3,"eval_duration":48123981} [2025-09-20 14:17:33] [INFO] [ai:140] llm-5124ca3a LLM response model=gemma3:12b duration=3.503s summary=πŸ‘€βœ¨ [2025-09-20 14:17:33] [DEBUG] [ai:142] llm-5124ca3a LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:17:33.5277511Z', 'response': 'πŸ‘€βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242509, 239794], 'total_duration': 3487062480, 'load_duration': 3237295173, 'prompt_eval_count': 118, 'prompt_eval_duration': 201027253, 'eval_count': 3, 'eval_duration': 48123981} [2025-09-20 14:17:33] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ‘€βœ¨ [2025-09-20 14:17:33] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:17:33] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:17:33] [INFO] [ai:132] llm-838542d2 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:17:33] [DEBUG] [ai:184] llm-838542d2 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:17:33] [DEBUG] [ai:185] llm-838542d2 Payload size=405 chars [2025-09-20 14:17:33] [DEBUG] [ai:193] llm-838542d2 Raw response status=200 [2025-09-20 14:17:33] [DEBUG] [ai:194] llm-838542d2 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:17:33.913050192Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":182474215,"load_duration":141028193,"prompt_eval_count":118,"prompt_eval_duration":13632404,"eval_count":3,"eval_duration":27383513} [2025-09-20 14:17:33] [INFO] [ai:140] llm-838542d2 LLM response model=gemma3:12b duration=0.185s summary=πŸ€”πŸ€¨ [2025-09-20 14:17:33] [DEBUG] [ai:142] llm-838542d2 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:17:33.913050192Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 182474215, 'load_duration': 141028193, 'prompt_eval_count': 118, 'prompt_eval_duration': 13632404, 'eval_count': 3, 'eval_duration': 27383513} [2025-09-20 14:17:33] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:17:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.93) [2025-09-20 14:17:34] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:17:34] [INFO] [bot:91] Command modfile info on cooldown. Retry after 10.0 seconds. [2025-09-20 14:17:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.24) [2025-09-20 14:17:35] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:17:35] [INFO] [bot:91] Command modfile on cooldown. Retry after 9.2 seconds. [2025-09-20 14:17:37] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.20) [2025-09-20 14:17:37] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 1.00) [2025-09-20 14:17:37] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-20 14:17:37] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.37) [2025-09-20 14:17:37] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.17) [2025-09-20 14:17:37] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:17:37] [INFO] [bot:91] Command modfile on cooldown. Retry after 6.5 seconds. [2025-09-20 14:17:41] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.14) [2025-09-20 14:17:41] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.23) [2025-09-20 14:17:41] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:17:41] [INFO] [bot:91] Command modfile on cooldown. Retry after 3.4 seconds. [2025-09-20 14:17:41] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:17:41] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:17:41] [INFO] [ai:132] llm-bfc9c877 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:17:41] [DEBUG] [ai:184] llm-bfc9c877 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:17:41] [DEBUG] [ai:185] llm-bfc9c877 Payload size=405 chars [2025-09-20 14:17:41] [DEBUG] [ai:193] llm-bfc9c877 Raw response status=200 [2025-09-20 14:17:41] [DEBUG] [ai:194] llm-bfc9c877 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:17:41.480434859Z","response":"πŸ€” 😢 😬","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,236743,247717,236743,244993],"total_duration":202103537,"load_duration":120133396,"prompt_eval_count":118,"prompt_eval_duration":14065140,"eval_count":6,"eval_duration":67544194} [2025-09-20 14:17:41] [INFO] [ai:140] llm-bfc9c877 LLM response model=gemma3:12b duration=0.205s summary=πŸ€” 😢 😬 [2025-09-20 14:17:41] [DEBUG] [ai:142] llm-bfc9c877 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:17:41.480434859Z', 'response': 'πŸ€” 😢 😬', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 236743, 247717, 236743, 244993], 'total_duration': 202103537, 'load_duration': 120133396, 'prompt_eval_count': 118, 'prompt_eval_duration': 14065140, 'eval_count': 6, 'eval_duration': 67544194} [2025-09-20 
14:17:41] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€” 😢 😬 [2025-09-20 14:17:42] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.39) [2025-09-20 14:17:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:01] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:19:01] [INFO] [bot:62] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:19:01] [INFO] [bot:64] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:19:01] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:19:01] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:19:01.179228796Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:19:01] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:19:01] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:19:01] [INFO] [bot:68] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:19:01] [INFO] [bot:71] βœ… Final model in use: gemma3:12b [2025-09-20 14:19:04] [INFO] [bot:281] Logged in as AI Bot [2025-09-20 14:19:04] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:19:05] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.26) [2025-09-20 14:19:05] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.93) [2025-09-20 14:19:05] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.30) [2025-09-20 14:19:08] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.85) [2025-09-20 14:19:08] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.58) [2025-09-20 14:19:11] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.28) [2025-09-20 14:19:11] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.63) [2025-09-20 14:19:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.20) [2025-09-20 14:19:13] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.59) [2025-09-20 14:19:15] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.21) [2025-09-20 14:19:15] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:17] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:19:17] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:19:17] [INFO] [ai:132] llm-38dd7ca4 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:19:17] [DEBUG] [ai:184] llm-38dd7ca4 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:19:17] [DEBUG] [ai:185] llm-38dd7ca4 Payload size=405 chars [2025-09-20 14:19:21] [DEBUG] [ai:193] llm-38dd7ca4 Raw response status=200 [2025-09-20 14:19:21] [DEBUG] [ai:194] llm-38dd7ca4 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:19:20.985300681Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":3502278697,"load_duration":3257536949,"prompt_eval_count":118,"prompt_eval_duration":196630825,"eval_count":3,"eval_duration":47603398} [2025-09-20 14:19:21] [INFO] [ai:140] llm-38dd7ca4 LLM response model=gemma3:12b duration=3.519s summary=πŸ€”πŸ€¨ [2025-09-20 14:19:21] [DEBUG] [ai:142] llm-38dd7ca4 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:19:20.985300681Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 3502278697, 'load_duration': 3257536949, 'prompt_eval_count': 118, 'prompt_eval_duration': 196630825, 'eval_count': 3, 'eval_duration': 47603398} [2025-09-20 14:19:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:19:21] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:19:21] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:19:21] [INFO] [ai:132] llm-67ef8cc5 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:19:21] [DEBUG] [ai:184] llm-67ef8cc5 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:19:21] [DEBUG] [ai:185] llm-67ef8cc5 Payload size=405 chars [2025-09-20 14:19:21] [DEBUG] [ai:193] llm-67ef8cc5 Raw response status=200 [2025-09-20 14:19:21] [DEBUG] [ai:194] llm-67ef8cc5 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:19:21.383923448Z","response":"πŸ™„πŸ˜‚πŸ€¦β€β™‚οΈ","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,243810,239927,243789,121457],"total_duration":216503008,"load_duration":147117257,"prompt_eval_count":118,"prompt_eval_duration":14063269,"eval_count":5,"eval_duration":54900258} [2025-09-20 14:19:21] [INFO] [ai:140] llm-67ef8cc5 LLM response model=gemma3:12b duration=0.219s summary=πŸ™„πŸ˜‚πŸ€¦β€β™‚οΈ [2025-09-20 14:19:21] [DEBUG] [ai:142] llm-67ef8cc5 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:19:21.383923448Z', 'response': 'πŸ™„πŸ˜‚πŸ€¦\u200d♂️', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 243810, 239927, 243789, 121457], 'total_duration': 216503008, 'load_duration': 147117257, 'prompt_eval_count': 118, 'prompt_eval_duration': 14063269, 'eval_count': 5, 'eval_duration': 54900258} [2025-09-20 14:19:21] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ™„πŸ˜‚πŸ€¦β€β™‚οΈ [2025-09-20 14:19:22] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.26) [2025-09-20 14:19:22] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:19:23] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.18) [2025-09-20 14:19:23] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
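Each `🎲 Reaction skipped (chance 0.10, roll 0.93)` entry at autochat:92 is a single uniform random draw compared against a fixed reaction probability; only rolls under the threshold go on to the LLM emoji-suggestion step logged at autochat:113. autochat.py is not included in this capture, so this is only a sketch of that gate, with the 10% chance taken from the log.

```python
import random

REACTION_CHANCE = 0.10  # matches the "chance 0.10" field in the DEBUG lines

def should_react() -> bool:
    roll = random.random()
    if roll >= REACTION_CHANCE:
        # Mirrors the DEBUG message emitted at autochat:92.
        print(f"🎲 Reaction skipped (chance {REACTION_CHANCE:.2f}, roll {roll:.2f})")
        return False
    # Below the threshold: go ask the model for an emoji suggestion (autochat:113).
    return True
```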
nohup: ignoring input [2025-09-20 14:20:59] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:20:59] [INFO] [bot:62] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:20:59] [INFO] [bot:64] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:20:59] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:20:59] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:20:59.414055505Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:20:59] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:20:59] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:20:59] [INFO] [bot:68] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:20:59] [INFO] [bot:71] βœ… Final model in use: gemma3:12b [2025-09-20 14:20:59] [INFO ] discord.client: logging in using static token [2025-09-20 14:21:00] [INFO ] discord.gateway: Shard ID None has connected to Gateway (Session ID: 3e5c4e86f5b137dff3fc231c6ac29ca5). βœ… Logged in as AI Bot [2025-09-20 14:21:02] [INFO] [bot:281] Logged in as AI Bot [2025-09-20 14:21:02] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:21:29] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.99) [2025-09-20 14:21:29] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.22) [2025-09-20 14:21:29] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:21:32] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.90) [2025-09-20 14:21:32] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.43) [2025-09-20 14:21:32] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:21:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.71) [2025-09-20 14:21:34] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:21:35] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:21:35] [INFO] llm-93dae670 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:21:35] [INFO] [ai:132] llm-93dae670 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:21:35] [DEBUG] [ai:184] llm-93dae670 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:21:35] [DEBUG] [ai:185] llm-93dae670 Payload size=405 chars [2025-09-20 14:21:38] [DEBUG] [ai:193] llm-93dae670 Raw response status=200 [2025-09-20 14:21:38] [DEBUG] [ai:194] llm-93dae670 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:21:38.614117471Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":3493747463,"load_duration":3242303444,"prompt_eval_count":118,"prompt_eval_duration":203903072,"eval_count":3,"eval_duration":47130599} [2025-09-20 14:21:38] [INFO] llm-93dae670 LLM response model=gemma3:12b duration=3.509s summary=πŸ€”πŸ€¨ [2025-09-20 14:21:38] [INFO] [ai:140] llm-93dae670 LLM response model=gemma3:12b duration=3.509s summary=πŸ€”πŸ€¨ [2025-09-20 14:21:38] [DEBUG] [ai:142] llm-93dae670 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:21:38.614117471Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 3493747463, 'load_duration': 3242303444, 'prompt_eval_count': 118, 'prompt_eval_duration': 203903072, 'eval_count': 3, 'eval_duration': 47130599} [2025-09-20 14:21:38] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:21:39] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:21:39] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:21:41] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.92) [2025-09-20 14:21:41] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 14:21:41] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:21:41] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 14:21:41] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:21:41] [INFO] llm-68d60731 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:21:41] [INFO] [ai:132] llm-68d60731 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:21:41] [DEBUG] [ai:184] llm-68d60731 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:21:41] [DEBUG] [ai:185] llm-68d60731 Payload size=405 chars [2025-09-20 14:21:41] [DEBUG] [ai:193] llm-68d60731 Raw response status=200 [2025-09-20 14:21:41] [DEBUG] [ai:194] llm-68d60731 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:21:41.828356105Z","response":"πŸ€”πŸ’»","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,244862],"total_duration":187608389,"load_duration":145998816,"prompt_eval_count":118,"prompt_eval_duration":14069005,"eval_count":3,"eval_duration":27205814} [2025-09-20 14:21:41] [INFO] llm-68d60731 LLM response model=gemma3:12b duration=0.190s summary=πŸ€”πŸ’» [2025-09-20 14:21:41] [INFO] [ai:140] llm-68d60731 LLM response model=gemma3:12b duration=0.190s summary=πŸ€”πŸ’» [2025-09-20 14:21:41] [DEBUG] [ai:142] llm-68d60731 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:21:41.828356105Z', 'response': 'πŸ€”πŸ’»', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 244862], 'total_duration': 187608389, 'load_duration': 145998816, 'prompt_eval_count': 118, 'prompt_eval_duration': 14069005, 'eval_count': 3, 'eval_duration': 27205814} [2025-09-20 14:21:41] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ’» [2025-09-20 14:21:42] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:21:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:17] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:46:17] [INFO] [bot:62] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:46:17] [INFO] [bot:64] 🧹 Attempting to clear VRAM before loading gemma3:12b... 
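The 🎭 "Emoji suggestion from LLM" entries, together with eval_count values of only 3–5 tokens, suggest the bot asks the model for a tiny emoji-only completion and uses it as a reaction suggestion. A sketch under those assumptions; the prompt wording, endpoint, and emoji regex are illustrative, not taken from the project:

    import re
    import requests

    OLLAMA = "http://localhost:11434"  # assumed default endpoint

    # Rough emoji matcher: symbols, pictographs, ZWJ and variation selectors.
    EMOJI_RE = re.compile(r"[\U0001F300-\U0001FAFF\u2600-\u27BF\u200d\ufe0f]+")

    def suggest_emoji(message: str, model: str = "gemma3:12b") -> str:
        """Ask the model for a one-or-two emoji reaction to `message` (prompt wording is an assumption)."""
        prompt = ("Reply with one or two emoji that fit this Discord message, "
                  "nothing else:\n" + message)
        resp = requests.post(f"{OLLAMA}/api/generate",
                             json={"model": model, "prompt": prompt, "stream": False},
                             timeout=60)
        resp.raise_for_status()
        text = resp.json().get("response", "")
        match = EMOJI_RE.search(text)
        return match.group(0) if match else ""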
[2025-09-20 14:46:17] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:46:17] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:46:17.621223308Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:46:17] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:46:17] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:46:17] [INFO] [bot:68] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:46:17] [INFO] [bot:71] βœ… Final model in use: gemma3:12b [2025-09-20 14:46:20] [INFO] [bot:281] Logged in as AI Bot [2025-09-20 14:46:20] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.57) [2025-09-20 14:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.28) [2025-09-20 14:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.10) [2025-09-20 14:46:30] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.69) [2025-09-20 14:46:30] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:30] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:30] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.94) [2025-09-20 14:46:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.67) [2025-09-20 14:46:35] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.80) [2025-09-20 14:46:35] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.49) [2025-09-20 14:46:35] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:46:35] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:48:08] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session 3e5c4e86f5b137dff3fc231c6ac29ca5. nohup: ignoring input [2025-09-20 14:48:32] [INFO] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:48:32] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:48:32] [INFO] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:48:32] [INFO] [bot:65] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:48:32] [INFO] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:48:32] [INFO] [bot:67] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:48:32] [INFO] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:48:32] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:48:32] [INFO] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:48:32.380324129Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:48:32] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:48:32.380324129Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:48:32] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 14:48:32] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:48:32] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 14:48:32] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:48:32] [INFO] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:48:32] [INFO] [bot:71] πŸš€ Model `gemma3:12b` preloaded on startup. 
[2025-09-20 14:48:32] [INFO] βœ… Final model in use: gemma3:12b [2025-09-20 14:48:32] [INFO] [bot:74] βœ… Final model in use: gemma3:12b [2025-09-20 14:48:32] [INFO ] discord.client: logging in using static token [2025-09-20 14:48:33] [INFO ] discord.gateway: Shard ID None has connected to Gateway (Session ID: 9841965c98ac0727a8f1fff8f27dbde3). βœ… Logged in as AI Bot [2025-09-20 14:48:35] [INFO] Logged in as AI Bot [2025-09-20 14:48:35] [INFO] [bot:284] Logged in as AI Bot [2025-09-20 14:48:35] [INFO] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:48:35] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:48:39] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.29) [2025-09-20 14:48:39] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.43) [2025-09-20 14:48:39] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:48:39] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:48:45] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.55) [2025-09-20 14:48:45] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.94) [2025-09-20 14:48:45] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:48:45] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:49:25] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:49:25] [INFO] [bot:65] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:49:25] [INFO] [bot:67] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:49:25] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:49:25] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:49:25.288430651Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:49:25] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:49:25] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:49:25] [INFO] [bot:71] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:49:25] [INFO] [bot:74] βœ… Final model in use: gemma3:12b [2025-09-20 14:49:28] [INFO] [bot:284] Logged in as AI Bot [2025-09-20 14:49:28] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:49:31] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.41) [2025-09-20 14:49:31] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.50) [2025-09-20 14:49:31] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:49:31] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.37) [2025-09-20 14:49:31] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.55) [2025-09-20 14:49:31] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:49:31] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. nohup: ignoring input [2025-09-20 14:50:29] [INFO] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:50:29] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 14:50:29] [INFO] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:50:29] [INFO] [bot:65] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 14:50:29] [INFO] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 14:50:29] [INFO] [bot:67] 🧹 Attempting to clear VRAM before loading gemma3:12b... 
[2025-09-20 14:50:29] [INFO] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:50:29] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 14:50:29] [INFO] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:50:29.064866391Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:50:29] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T18:50:29.064866391Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 14:50:29] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 14:50:29] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:50:29] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 14:50:29] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 14:50:29] [INFO] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:50:29] [INFO] [bot:71] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 14:50:29] [INFO] βœ… Final model in use: gemma3:12b [2025-09-20 14:50:29] [INFO] [bot:74] βœ… Final model in use: gemma3:12b [2025-09-20 14:50:29] [INFO ] discord.client: logging in using static token [2025-09-20 14:50:29] [INFO ] discord.gateway: Shard ID None has connected to Gateway (Session ID: 2574d17b97e34954eed7eec44ce31576). βœ… Logged in as AI Bot [2025-09-20 14:50:31] [INFO] Logged in as AI Bot [2025-09-20 14:50:31] [INFO] [bot:284] Logged in as AI Bot [2025-09-20 14:50:31] [INFO] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:50:31] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 14:53:57] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 14:53:57] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 14:53:57] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 14:53:57] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
[2025-09-20 14:53:57] [INFO] llm-3c80482f LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:53:57] [INFO] [ai:132] llm-3c80482f LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 14:53:57] [DEBUG] [ai:184] llm-3c80482f Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 14:53:57] [DEBUG] [ai:185] llm-3c80482f Payload size=405 chars [2025-09-20 14:54:02] [DEBUG] [ai:193] llm-3c80482f Raw response status=200 [2025-09-20 14:54:02] [DEBUG] [ai:194] llm-3c80482f Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T18:54:02.209515032Z","response":"πŸ€”πŸ€¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,248294],"total_duration":4516258539,"load_duration":4259336523,"prompt_eval_count":118,"prompt_eval_duration":205447411,"eval_count":3,"eval_duration":51017445} [2025-09-20 14:54:02] [INFO] llm-3c80482f LLM response model=gemma3:12b duration=4.531s summary=πŸ€”πŸ€¨ [2025-09-20 14:54:02] [INFO] [ai:140] llm-3c80482f LLM response model=gemma3:12b duration=4.531s summary=πŸ€”πŸ€¨ [2025-09-20 14:54:02] [DEBUG] [ai:142] llm-3c80482f LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T18:54:02.209515032Z', 'response': 'πŸ€”πŸ€¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 248294], 'total_duration': 4516258539, 'load_duration': 4259336523, 'prompt_eval_count': 118, 'prompt_eval_duration': 205447411, 'eval_count': 3, 'eval_duration': 51017445} [2025-09-20 14:54:02] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ€¨ [2025-09-20 14:54:03] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.83) [2025-09-20 14:54:03] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 14:54:03] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. 
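Each call above carries a short llm-xxxxxxxx correlation ID plus a duration field, which is what makes it possible to pair a "request start" line with its "response" line even when handlers interleave. A hedged sketch of that wrapper; the helper name and the summary truncation length are assumptions:

    import logging
    import time
    import uuid

    logger = logging.getLogger("ai")

    def log_llm_call(model: str, user: str, context_len: int, call):
        """Wrap an LLM call with a short correlation ID and a wall-clock duration,
        mirroring the llm-xxxxxxxx / duration=...s format seen in the log above.
        `call` is any zero-argument callable returning the response text."""
        req_id = f"llm-{uuid.uuid4().hex[:8]}"
        logger.info("%s LLM request start model=%s user=%s context_len=%d",
                    req_id, model, user or "-", context_len)
        start = time.monotonic()
        text = call()
        duration = time.monotonic() - start
        logger.info("%s LLM response model=%s duration=%.3fs summary=%s",
                    req_id, model, duration, text[:160])
        return text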
[2025-09-20 14:54:03] [DEBUG] [bot:361] modfile_info invoked: cmd=modfile info user=161149541171593216 msg=1419033863518421166 [2025-09-20 14:54:03] [DEBUG] [bot:382] modfile_info: preparing to send reply for msg=1419033863518421166 [2025-09-20 14:54:03] [DEBUG] [bot:398] modfile_info: sent payload for msg=1419033863518421166 nohup: ignoring input [2025-09-20 15:00:12] [INFO] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 15:00:12] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 15:00:12] [INFO] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 15:00:12] [INFO] [bot:65] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 15:00:12] [INFO] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 15:00:12] [INFO] [bot:67] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 15:00:12] [INFO] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 15:00:12] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 15:00:12] [INFO] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T19:00:12.70420662Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 15:00:12] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T19:00:12.70420662Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 15:00:12] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 15:00:12] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:00:12] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 15:00:12] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 15:00:12] [INFO] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 15:00:12] [INFO] [bot:71] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 15:00:12] [INFO] βœ… Final model in use: gemma3:12b [2025-09-20 15:00:12] [INFO] [bot:74] βœ… Final model in use: gemma3:12b [2025-09-20 15:00:12] [INFO ] discord.client: logging in using static token [2025-09-20 15:00:14] [INFO ] discord.gateway: Shard ID None has connected to Gateway (Session ID: eed4b6aaccccd97b5456c73e342e25a1). βœ… Logged in as AI Bot [2025-09-20 15:00:16] [INFO] Logged in as AI Bot [2025-09-20 15:00:16] [INFO] [bot:289] Logged in as AI Bot [2025-09-20 15:00:16] [INFO] πŸ›‘ Scheduler disabled in config. [2025-09-20 15:00:16] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 15:00:27] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.99) [2025-09-20 15:00:27] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.70) [2025-09-20 15:00:27] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:00:27] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:00:28] [DEBUG] [bot:366] modfile_info invoked: cmd=modfile info user=161149541171593216 msg=1419035499590451321 [2025-09-20 15:00:28] [DEBUG] [bot:387] modfile_info: preparing to send reply for msg=1419035499590451321 [2025-09-20 15:00:28] [DEBUG] [bot:243] on_message: observed own message id=1419035502119489546 channel=1380999713272238151 [2025-09-20 15:00:28] [DEBUG] [bot:406] modfile_info: sent payload for msg=1419035499590451321 -> sent_id=1419035502119489546 channel=1380999713272238151 [2025-09-20 15:00:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.20) [2025-09-20 15:00:34] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.26) [2025-09-20 15:00:34] [INFO] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-20 15:00:34] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:00:34] [DEBUG] [bot:366] modfile_info invoked: cmd=modfile info user=161149541171593216 msg=1419035528778748005 [2025-09-20 15:00:34] [DEBUG] [bot:387] modfile_info: preparing to send reply for msg=1419035528778748005 [2025-09-20 15:00:34] [DEBUG] [bot:243] on_message: observed own message id=1419035529994833921 channel=1380999713272238151 [2025-09-20 15:00:34] [DEBUG] [bot:406] modfile_info: sent payload for msg=1419035528778748005 -> sent_id=1419035529994833921 channel=1380999713272238151 [2025-09-20 15:04:01] [INFO] [ai:40] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 15:04:01] [INFO] [bot:65] πŸ” Loaded MODEL_NAME from .env: gemma3:12b [2025-09-20 15:04:01] [INFO] [bot:67] 🧹 Attempting to clear VRAM before loading gemma3:12b... [2025-09-20 15:04:01] [INFO] [ai:113] 🧹 Sending safe unload request for `gemma3:12b` [2025-09-20 15:04:01] [INFO] [ai:120] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-09-20T19:04:01.060550365Z","response":"","done":true,"done_reason":"unload"} [2025-09-20 15:04:01] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:01] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:01] [INFO] [bot:71] πŸš€ Model `gemma3:12b` preloaded on startup. [2025-09-20 15:04:01] [INFO] [bot:74] βœ… Final model in use: gemma3:12b [2025-09-20 15:04:04] [INFO] [bot:289] Logged in as AI Bot [2025-09-20 15:04:04] [INFO] [scheduler:29] πŸ›‘ Scheduler disabled in config. [2025-09-20 15:04:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.85) [2025-09-20 15:04:13] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.93) [2025-09-20 15:04:13] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:13] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:13] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:13] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:13] [INFO] [ai:100] πŸ“¦ Model pull started successfully. 
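The modfile_info DEBUG lines above trace a command from invocation (user and message ID) through to the ID and channel of the reply it sends, which is what lets the later "observed own message" entries be matched back to the command that produced them. A speculative discord.py-style sketch; only the logger name, the "modfile info" command, and the Modelfile path appear in the log, everything else is assumed:

    import logging
    import discord
    from discord.ext import commands

    logger = logging.getLogger("bot")
    intents = discord.Intents.default()
    intents.message_content = True
    bot = commands.Bot(command_prefix="!", intents=intents)

    @bot.group(name="modfile")
    async def modfile(ctx: commands.Context):
        # Parent group; the log's "cmd=modfile info" suggests a subcommand layout.
        if ctx.invoked_subcommand is None:
            await ctx.reply("Usage: !modfile info")

    @modfile.command(name="info")
    async def modfile_info(ctx: commands.Context):
        logger.debug("modfile_info invoked: cmd=modfile info user=%s msg=%s",
                     ctx.author.id, ctx.message.id)
        logger.debug("modfile_info: preparing to send reply for msg=%s", ctx.message.id)
        sent = await ctx.reply("πŸ” Modelfile loaded: ../examples/gojo.mod")  # path taken from the log above
        logger.debug("modfile_info: sent payload for msg=%s -> sent_id=%s channel=%s",
                     ctx.message.id, sent.id, sent.channel.id)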
[2025-09-20 15:04:13] [INFO] llm-9d680449 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:13] [INFO] [ai:132] llm-9d680449 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:13] [DEBUG] [ai:184] llm-9d680449 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 15:04:13] [DEBUG] [ai:185] llm-9d680449 Payload size=405 chars [2025-09-20 15:04:13] [DEBUG] [bot:366] modfile_info invoked: cmd=modfile info user=161149541171593216 msg=1419036447511412856 [2025-09-20 15:04:13] [DEBUG] [bot:387] modfile_info: preparing to send reply for msg=1419036447511412856 [2025-09-20 15:04:13] [DEBUG] [bot:243] on_message: observed own message id=1419036449256243210 channel=1380999713272238151 [2025-09-20 15:04:14] [DEBUG] [bot:406] modfile_info: sent payload for msg=1419036447511412856 -> sent_id=1419036449256243210 channel=1380999713272238151 [2025-09-20 15:04:17] [DEBUG] [ai:193] llm-9d680449 Raw response status=200 [2025-09-20 15:04:17] [DEBUG] [ai:194] llm-9d680449 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T19:04:17.308741965Z","response":"πŸ˜‚βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,8283,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,239927,239794],"total_duration":3572151729,"load_duration":3317234706,"prompt_eval_count":118,"prompt_eval_duration":204624732,"eval_count":3,"eval_duration":49820406} [2025-09-20 15:04:17] [INFO] llm-9d680449 LLM response model=gemma3:12b duration=3.588s summary=πŸ˜‚βœ¨ [2025-09-20 15:04:17] [INFO] [ai:140] llm-9d680449 LLM response model=gemma3:12b duration=3.588s summary=πŸ˜‚βœ¨ [2025-09-20 15:04:17] [DEBUG] [ai:142] llm-9d680449 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T19:04:17.308741965Z', 'response': 'πŸ˜‚βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 8283, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 239927, 239794], 'total_duration': 3572151729, 'load_duration': 3317234706, 'prompt_eval_count': 118, 'prompt_eval_duration': 204624732, 'eval_count': 3, 'eval_duration': 49820406} [2025-09-20 15:04:17] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ˜‚βœ¨ [2025-09-20 15:04:17] [DEBUG] [bot:243] on_message: observed own message id=1419036449256243210 
channel=1380999713272238151 [2025-09-20 15:04:17] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.52) [2025-09-20 15:04:17] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:17] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:17] [DEBUG] [bot:366] modfile_info invoked: cmd=modfile info user=161149541171593216 msg=1419036447511412856 [2025-09-20 15:04:17] [DEBUG] [bot:387] modfile_info: preparing to send reply for msg=1419036447511412856 [2025-09-20 15:04:17] [DEBUG] [bot:243] on_message: observed own message id=1419036465698050200 channel=1380999713272238151 [2025-09-20 15:04:17] [DEBUG] [bot:243] on_message: observed own message id=1419036465698050200 channel=1380999713272238151 [2025-09-20 15:04:17] [DEBUG] [bot:406] modfile_info: sent payload for msg=1419036447511412856 -> sent_id=1419036465698050200 channel=1380999713272238151 [2025-09-20 15:04:24] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:25] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:25] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:25] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:25] [INFO] [ai:132] llm-d00b278c LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:25] [DEBUG] [ai:184] llm-d00b278c Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 15:04:25] [DEBUG] [ai:185] llm-d00b278c Payload size=405 chars [2025-09-20 15:04:25] [INFO] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:25] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:25] [INFO] llm-4e53403d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:25] [INFO] [ai:132] llm-4e53403d LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:25] [DEBUG] [ai:184] llm-4e53403d Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 15:04:25] [DEBUG] [ai:185] llm-4e53403d Payload size=405 chars [2025-09-20 15:04:25] [DEBUG] [ai:193] llm-d00b278c Raw response status=200 [2025-09-20 15:04:25] [DEBUG] [ai:194] llm-d00b278c Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T19:04:25.359175507Z","response":"πŸ€”πŸ‘€","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,1694,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,242509],"total_duration":218701295,"load_duration":144673481,"prompt_eval_count":118,"prompt_eval_duration":43918420,"eval_count":3,"eval_duration":29769207} [2025-09-20 15:04:25] [INFO] [ai:140] llm-d00b278c LLM response model=gemma3:12b duration=0.221s summary=πŸ€”πŸ‘€ [2025-09-20 15:04:25] [DEBUG] [ai:142] llm-d00b278c LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T19:04:25.359175507Z', 'response': 'πŸ€”πŸ‘€', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 
1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 1694, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 242509], 'total_duration': 218701295, 'load_duration': 144673481, 'prompt_eval_count': 118, 'prompt_eval_duration': 43918420, 'eval_count': 3, 'eval_duration': 29769207} [2025-09-20 15:04:25] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ‘€ [2025-09-20 15:04:25] [DEBUG] [ai:193] llm-4e53403d Raw response status=200 [2025-09-20 15:04:25] [DEBUG] [ai:194] llm-4e53403d Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T19:04:25.449207015Z","response":"πŸ€”πŸ’»","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,1694,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,244862],"total_duration":174373541,"load_duration":131544684,"prompt_eval_count":118,"prompt_eval_duration":14092238,"eval_count":3,"eval_duration":28323536} [2025-09-20 15:04:25] [INFO] llm-4e53403d LLM response model=gemma3:12b duration=0.177s summary=πŸ€”πŸ’» [2025-09-20 15:04:25] [INFO] [ai:140] llm-4e53403d LLM response model=gemma3:12b duration=0.177s summary=πŸ€”πŸ’» [2025-09-20 15:04:25] [DEBUG] [ai:142] llm-4e53403d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T19:04:25.449207015Z', 'response': 'πŸ€”πŸ’»', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 1694, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 244862], 'total_duration': 174373541, 'load_duration': 131544684, 'prompt_eval_count': 118, 'prompt_eval_duration': 14092238, 'eval_count': 3, 'eval_duration': 28323536} [2025-09-20 15:04:25] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”πŸ’» [2025-09-20 15:04:25] [WARNING ] discord.http: We are being rate limited. 
PUT https://discord.com/api/v10/channels/1380999713272238151/messages/1419036495280607365/reactions/%F0%9F%A4%94/@me responded with 429. Retrying in 0.30 seconds. [2025-09-20 15:04:26] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.95) [2025-09-20 15:04:26] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:26] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:26] [DEBUG] [bot:243] on_message: observed own message id=1419036502838612049 channel=1380999713272238151 [2025-09-20 15:04:26] [DEBUG] [bot:243] on_message: observed own message id=1419036502838612049 channel=1380999713272238151 [2025-09-20 15:04:27] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-20 15:04:27] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-20 15:04:27] [INFO] [ai:132] llm-6f15e707 LLM request start model=gemma3:12b user=- context_len=0 [2025-09-20 15:04:27] [DEBUG] [ai:184] llm-6f15e707 Sending payload to Ollama: model=gemma3:12b user=None [2025-09-20 15:04:27] [DEBUG] [ai:185] llm-6f15e707 Payload size=405 chars [2025-09-20 15:04:27] [DEBUG] [ai:193] llm-6f15e707 Raw response status=200 [2025-09-20 15:04:27] [DEBUG] [ai:194] llm-6f15e707 Raw response body={"model":"gemma3:12b","created_at":"2025-09-20T19:04:27.379077005Z","response":"πŸ€”βœ¨","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1599,659,22616,236764,614,73296,2872,1015,6178,111730,531,9434,531,2430,10396,236761,17770,496,3618,236764,994,44917,236743,236770,236964,236800,22690,111730,528,3072,531,506,15737,653,3004,236761,2301,1816,236761,108,2887,236787,1717,1901,2164,1694,107,4722,236787,107,236820,236909,111457,111038,106,107,105,4368,107,242846,239794],"total_duration":165308303,"load_duration":121792655,"prompt_eval_count":118,"prompt_eval_duration":14095611,"eval_count":3,"eval_duration":29071368} [2025-09-20 15:04:27] [INFO] [ai:140] llm-6f15e707 LLM response model=gemma3:12b duration=0.168s summary=πŸ€”βœ¨ [2025-09-20 15:04:27] [DEBUG] [ai:142] llm-6f15e707 LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-20T19:04:27.379077005Z', 'response': 'πŸ€”βœ¨', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1599, 659, 22616, 236764, 614, 73296, 2872, 1015, 6178, 111730, 531, 9434, 531, 2430, 10396, 236761, 17770, 496, 3618, 236764, 994, 44917, 236743, 236770, 236964, 236800, 22690, 111730, 528, 3072, 531, 506, 15737, 653, 3004, 236761, 2301, 1816, 236761, 108, 2887, 236787, 1717, 1901, 2164, 1694, 107, 4722, 236787, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 242846, 239794], 'total_duration': 165308303, 'load_duration': 121792655, 'prompt_eval_count': 118, 'prompt_eval_duration': 14095611, 'eval_count': 3, 'eval_duration': 29071368} [2025-09-20 15:04:27] [DEBUG] [autochat:113] 🎭 Emoji suggestion from LLM: πŸ€”βœ¨ [2025-09-20 15:04:27] [INFO] 
[autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-20 15:04:27] [DEBUG] [bot:243] on_message: observed own message id=1419036507569786931 channel=1380999713272238151 [2025-09-20 15:04:27] [DEBUG] [bot:243] on_message: observed own message id=1419036507569786931 channel=1380999713272238151 [2025-09-20 15:23:13] [INFO] [ai:69] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 15:30:27] [INFO] [ai:69] πŸ” Modelfile loaded: ../examples/gojo.mod [2025-09-20 17:25:57] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-20 18:56:21] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-20 21:10:39] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-20 23:45:51] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 00:14:15] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 00:14:15] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 00:14:15] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 03:20:46] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 06:16:09] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 06:54:20] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 09:49:43] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 09:49:43] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 10:36:38] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 10:51:44] [INFO] 🎯 Trigger word detected β€” Delta fully engaged. [2025-09-21 10:51:44] [INFO] [autochat:151] 🎯 Trigger word detected β€” Delta fully engaged. [2025-09-21 10:51:44] [INFO] πŸ€– Considering passive reply (author: PLEX) [2025-09-21 10:51:44] [INFO] [autochat:167] πŸ€– Considering passive reply (author: PLEX) [2025-09-21 10:51:44] [INFO] πŸ“š Retrieved 0 messages for context [2025-09-21 10:51:44] [INFO] [autochat:168] πŸ“š Retrieved 0 messages for context [2025-09-21 10:51:44] [INFO] 🧠 Preloading model: gemma3:12b [2025-09-21 10:51:44] [INFO] [ai:93] 🧠 Preloading model: gemma3:12b [2025-09-21 10:51:45] [INFO] πŸ“¦ Model pull started successfully. [2025-09-21 10:51:45] [INFO] [ai:100] πŸ“¦ Model pull started successfully. [2025-09-21 10:51:45] [INFO] llm-4b1d8b1d LLM request start model=gemma3:12b user=PLEX context_len=0 [2025-09-21 10:51:45] [INFO] [ai:132] llm-4b1d8b1d LLM request start model=gemma3:12b user=PLEX context_len=0 [2025-09-21 10:51:45] [DEBUG] [ai:184] llm-4b1d8b1d Sending payload to Ollama: model=gemma3:12b user=PLEX [2025-09-21 10:51:45] [DEBUG] [ai:185] llm-4b1d8b1d Payload size=292 chars [2025-09-21 10:51:55] [WARNING ] discord.gateway: Shard ID None heartbeat blocked for more than 10 seconds. 
Loop thread traceback (most recent call last):
  File "/home/milo/Documents/AI-Discord-Bot/src/bot.py", line 546, in bot.run(TOKEN)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/discord/client.py", line 929, in run
    asyncio.run(runner())
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/base_events.py", line 641, in run_until_complete
    self.run_forever()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/base_events.py", line 608, in run_forever
    self._run_once()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/base_events.py", line 1936, in _run_once
    handle._run()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/asyncio/events.py", line 84, in _run
    self._context.run(self._callback, *self._args)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/discord/client.py", line 504, in _run_event
    await coro(*args, **kwargs)
  File "/home/milo/Documents/AI-Discord-Bot/src/bot.py", line 258, in on_message
    reply = await generate_auto_reply(message, bot)
  File "/home/milo/Documents/AI-Discord-Bot/src/autochat.py", line 171, in generate_auto_reply
    reply = get_ai_response(
  File "/home/milo/Documents/AI-Discord-Bot/src/ai.py", line 190, in get_ai_response
    return resp.status_code == 200
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/requests/api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/requests/adapters.py", line 644, in send
    resp = conn.urlopen(
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/urllib3/connectionpool.py", line 787, in urlopen
    response = self._make_request(
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/urllib3/connectionpool.py", line 534, in _make_request
    response = conn.getresponse()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/site-packages/urllib3/connection.py", line 565, in getresponse
    httplib_response = super().getresponse()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/http/client.py", line 1395, in getresponse
    response.begin()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/http/client.py", line 325, in begin
    version, status, reason = self._read_status()
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/http/client.py", line 286, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/home/milo/miniconda3/envs/discord-bot/lib/python3.11/socket.py", line 718, in readinto
    return self._sock.recv_into(b)
[2025-09-21 10:51:57] [DEBUG] [ai:193] llm-4b1d8b1d Raw response status=200 [2025-09-21 10:51:57] [DEBUG] [ai:194] llm-4b1d8b1d Raw response
body={"model":"gemma3:12b","created_at":"2025-09-21T14:51:57.90355062Z","response":"Seriously? You're getting excited about *that*? A Plex add-on? Look, I appreciate you sharing, but my tastes tend to run a little more… refined. Like, maybe a flawlessly executed Domain Expansion, you know? \n\nBut hey, if it makes you happy, that's cool. Just don't expect me to start obsessing over it. Though... \"Epicfighter,\" huh? Does it, like, *fight* things? Maybe it’s got some hidden potential. I'm always interested in things with potential. πŸ˜‰\n\nDon’t go getting all flustered now. Just trying to be friendly. Now, tell me something *truly* interesting.","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,558,1474,115274,568,43175,236768,251,107,236836,19221,75591,236764,48349,97639,753,175625,75591,236764,48349,97639,691,5569,3742,531,127610,236761,107,236820,236909,111457,111038,106,107,105,4368,107,155420,236881,1599,236789,500,3978,9775,1003,808,7705,236829,236881,562,127610,1138,236772,498,236881,11696,236764,564,14756,611,11211,236764,840,1041,38527,6316,531,1845,496,2268,919,237064,33093,236761,9929,236764,7463,496,174109,19284,49822,81909,236764,611,1281,236881,236743,108,4573,31251,236764,768,625,3590,611,5293,236764,600,236789,236751,5427,236761,5393,1537,236789,236745,2414,786,531,1502,88751,522,1024,625,236761,20043,1390,623,130254,97639,2098,78345,236881,14300,625,236764,1133,236764,808,74364,236829,2432,236881,15704,625,236858,236751,2506,1070,11497,3435,236761,564,236789,236757,2462,7467,528,2432,607,3435,236761,85345,108,8993,236858,236745,817,3978,784,1378,707,3747,1492,236761,5393,4875,531,577,10841,236761,4224,236764,3442,786,2613,808,1168,3623,236829,6121,236761],"total_duration":12671648586,"load_duration":10067437581,"prompt_eval_count":95,"prompt_eval_duration":602342069,"eval_count":146,"eval_duration":2001253770} [2025-09-21 10:51:57] [INFO] llm-4b1d8b1d LLM response model=gemma3:12b duration=12.684s summary=Seriously? You're getting excited about *that*? A Plex add-on? Look, I appreciate you sharing, but my tastes tend to run a little more… refined. Like, maybe a f [2025-09-21 10:51:57] [INFO] [ai:140] llm-4b1d8b1d LLM response model=gemma3:12b duration=12.684s summary=Seriously? You're getting excited about *that*? A Plex add-on? Look, I appreciate you sharing, but my tastes tend to run a little more… refined. Like, maybe a f [2025-09-21 10:51:57] [DEBUG] [ai:142] llm-4b1d8b1d LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-09-21T14:51:57.90355062Z', 'response': 'Seriously? You\'re getting excited about *that*? A Plex add-on? Look, I appreciate you sharing, but my tastes tend to run a little more… refined. Like, maybe a flawlessly executed Domain Expansion, you know? \n\nBut hey, if it makes you happy, that\'s cool. Just don\'t expect me to start obsessing over it. Though... "Epicfighter," huh? Does it, like, *fight* things? Maybe it’s got some hidden potential. I\'m always interested in things with potential. πŸ˜‰\n\nDon’t go getting all flustered now. Just trying to be friendly. 
Now, tell me something *truly* interesting.', 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 558, 1474, 115274, 568, 43175, 236768, 251, 107, 236836, 19221, 75591, 236764, 48349, 97639, 753, 175625, 75591, 236764, 48349, 97639, 691, 5569, 3742, 531, 127610, 236761, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 155420, 236881, 1599, 236789, 500, 3978, 9775, 1003, 808, 7705, 236829, 236881, 562, 127610, 1138, 236772, 498, 236881, 11696, 236764, 564, 14756, 611, 11211, 236764, 840, 1041, 38527, 6316, 531, 1845, 496, 2268, 919, 237064, 33093, 236761, 9929, 236764, 7463, 496, 174109, 19284, 49822, 81909, 236764, 611, 1281, 236881, 236743, 108, 4573, 31251, 236764, 768, 625, 3590, 611, 5293, 236764, 600, 236789, 236751, 5427, 236761, 5393, 1537, 236789, 236745, 2414, 786, 531, 1502, 88751, 522, 1024, 625, 236761, 20043, 1390, 623, 130254, 97639, 2098, 78345, 236881, 14300, 625, 236764, 1133, 236764, 808, 74364, 236829, 2432, 236881, 15704, 625, 236858, 236751, 2506, 1070, 11497, 3435, 236761, 564, 236789, 236757, 2462, 7467, 528, 2432, 607, 3435, 236761, 85345, 108, 8993, 236858, 236745, 817, 3978, 784, 1378, 707, 3747, 1492, 236761, 5393, 4875, 531, 577, 10841, 236761, 4224, 236764, 3442, 786, 2613, 808, 1168, 3623, 236829, 6121, 236761], 'total_duration': 12671648586, 'load_duration': 10067437581, 'prompt_eval_count': 95, 'prompt_eval_duration': 602342069, 'eval_count': 146, 'eval_duration': 2001253770} [2025-09-21 10:51:58] [DEBUG] [bot:243] on_message: observed own message id=1419335352275046534 channel=1370422222900035834 [2025-09-21 11:02:34] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 11:24:52] [INFO] πŸ“‰ Engagement decayed by 4.97, new score: 0.00 [2025-09-21 11:24:52] [INFO] [autochat:72] πŸ“‰ Engagement decayed by 4.97, new score: 0.00 [2025-09-21 11:24:52] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 11:24:52] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 13:53:00] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 14:14:42] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 14:14:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-21 17:49:42] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 20:06:03] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-21 22:12:35] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 00:13:17] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 00:37:30] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 01:36:43] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. 
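The "heartbeat blocked for more than 10 seconds" warning and the stack dump above show the gateway's event loop stuck inside a synchronous requests.post (socket recv_into at the bottom of the stack) while Ollama spends roughly 12.7 s generating the PLEX reply. One hedged way to keep the heartbeat alive is to push the blocking HTTP call onto a worker thread with asyncio.to_thread; this is a sketch under that assumption, not the project's actual fix, and the endpoint and function names are illustrative:

    import asyncio
    import requests

    OLLAMA = "http://localhost:11434"  # assumed default endpoint

    def get_ai_response_blocking(prompt: str, model: str = "gemma3:12b") -> str:
        # Synchronous call, same shape as the requests stack in the traceback above.
        resp = requests.post(f"{OLLAMA}/api/generate",
                             json={"model": model, "prompt": prompt, "stream": False},
                             timeout=120)
        resp.raise_for_status()
        return resp.json().get("response", "")

    async def generate_auto_reply(prompt: str) -> str:
        # Running the blocking call in a thread keeps the Discord heartbeat
        # responsive even when generation takes 10+ seconds.
        return await asyncio.to_thread(get_ai_response_blocking, prompt)

An async HTTP client such as aiohttp would achieve the same result without the extra thread.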
[2025-09-22 02:36:46] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 03:34:17] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 04:37:30] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 07:02:18] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 07:47:37] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 07:47:37] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 07:49:26] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 10:03:32] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 11:04:18] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 11:10:42] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 11:26:29] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 11:26:29] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 12:23:43] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 12:23:43] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 14:05:06] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 17:10:17] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 19:41:15] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 22:11:56] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-22 22:52:48] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-22 22:52:48] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-23 02:09:31] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 04:38:52] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 04:40:02] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 06:16:50] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 07:01:26] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 08:06:48] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 08:55:06] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 10:34:38] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 12:26:12] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 13:02:50] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. 
[2025-09-23 13:59:03] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 14:28:51] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 14:57:40] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 15:38:37] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 17:41:38] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-23 20:39:12] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 00:27:45] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 02:17:48] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 02:32:45] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 03:35:31] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 04:35:28] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-24 04:35:28] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-24 05:47:42] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 06:29:59] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 07:57:09] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 10:42:52] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 11:18:29] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 12:04:05] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 12:23:24] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 12:49:12] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 15:09:15] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 17:04:23] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 18:18:21] [INFO] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-24 18:18:21] [INFO] [autochat:161] 😴 No trigger and engagement is 0 β€” skipping. [2025-09-24 20:33:34] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 21:36:46] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 22:41:17] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-24 23:59:24] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1. [2025-09-25 02:25:55] [INFO] 😴 No trigger and engagement is 0 β€” skipping. 
[2025-09-25 02:25:55] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-25 02:33:19] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 04:17:56] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 07:55:57] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 08:54:52] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-25 08:54:52] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-25 11:41:09] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 15:29:36] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 16:18:12] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 19:53:29] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 22:05:52] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 23:27:46] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-25 23:58:21] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 02:42:06] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 04:34:48] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 05:31:11] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 06:03:12] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 07:18:08] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 07:53:59] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 07:53:59] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 09:09:22] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 10:27:28] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 11:26:58] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 11:45:11] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 11:45:11] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 12:43:23] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 13:03:55] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 13:08:57] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 13:29:44] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 14:30:16] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
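The recurring "😴 No trigger and engagement is 0" entries are autochat declining to post: nothing in the recent messages matched a trigger and the engagement score is zero, so the handler returns early (the 🎲 "Reaction skipped" DEBUG lines later in the log are the companion random roll for emoji reactions). A hypothetical sketch of that gate; has_trigger, engagement, and REACTION_CHANCE are assumed names that do not appear in the log:

    import logging
    import random

    log = logging.getLogger("autochat")
    REACTION_CHANCE = 0.10  # assumed constant matching "chance 0.10" in the DEBUG lines

    def maybe_react_and_reply(has_trigger: bool, engagement: int) -> bool:
        roll = random.random()
        if roll >= REACTION_CHANCE:
            log.debug("🎲 Reaction skipped (chance %.2f, roll %.2f)", REACTION_CHANCE, roll)
        if not has_trigger and engagement == 0:
            log.info("😴 No trigger and engagement is 0 — skipping.")
            return False
        return True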
[2025-09-26 15:20:03] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 15:41:41] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 17:18:05] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 17:55:25] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 19:33:49] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 19:33:49] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 20:22:28] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-26 20:22:55] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 20:22:55] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-26 23:49:29] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 03:04:51] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 03:57:04] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 04:48:46] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 07:06:34] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 10:40:30] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-09-27 10:54:42] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-27 10:54:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-27 12:38:20] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-10-10 12:56:05] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:56:05] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:56:05] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:56:24] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:56:24] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:56:24] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:56:24] [INFO] [migration:156] Starting migration process...
[2025-10-10 12:56:24] [ERROR] [migration:45] Failed to migrate user profiles: 'DatabaseManager' object has no attribute 'store_user_profile'
[2025-10-10 12:56:24] [INFO] [migration:87] Migrated 0 conversation memories and 0 user memories
[2025-10-10 12:56:24] [INFO] [migration:92] Backed up original file to src/memory.json.backup.20251010_125624
[2025-10-10 12:56:24] [INFO] [migration:99] Verifying migration...
[2025-10-10 12:57:27] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:57:27] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:57:27] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:57:27] [INFO] [migration:156] Starting migration process...
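The 12:56:24 run aborts its profile step because DatabaseManager has no store_user_profile method; the error is caught, the memory migration still reports 0 items, and memory.json is backed up. The 12:57:27 rerun, continuing below, completes once that method exists. A hypothetical sketch of the profile-migration step; only DatabaseManager and store_user_profile come from the log, every other name and path-handling detail is an assumption:

    import json
    import logging
    import shutil
    from datetime import datetime

    log = logging.getLogger("migration")

    def migrate_user_profiles(db, path: str = "src/user_profiles.json") -> int:
        """Copy legacy JSON profiles into the new database backend."""
        try:
            with open(path, encoding="utf-8") as f:
                profiles = json.load(f)
            for user_id, profile in profiles.items():
                db.store_user_profile(user_id, profile)  # AttributeError here on the 12:56:24 run
        except Exception as exc:
            log.error("Failed to migrate user profiles: %s", exc)
            return 0
        log.info("Migrated %d user profiles", len(profiles))
        backup = f"{path}.backup.{datetime.now():%Y%m%d_%H%M%S}"
        shutil.copy(path, backup)
        log.info("Backed up original file to %s", backup)
        return len(profiles)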
[2025-10-10 12:57:27] [INFO] [migration:37] Migrated 2 user profiles
[2025-10-10 12:57:27] [INFO] [migration:42] Backed up original file to src/user_profiles.json.backup.20251010_125727
[2025-10-10 12:57:27] [INFO] [migration:87] Migrated 0 conversation memories and 0 user memories
[2025-10-10 12:57:27] [INFO] [migration:92] Backed up original file to src/memory.json.backup.20251010_125727
[2025-10-10 12:57:27] [INFO] [migration:99] Verifying migration...
[2025-10-10 12:57:27] [INFO] [migration:107] ✓ User profile operations working
[2025-10-10 12:57:27] [INFO] [migration:118] ✓ Memory operations working
[2025-10-10 12:57:27] [INFO] [migration:138] ✓ Migration verification completed successfully
[2025-10-10 14:52:33] [INFO] [ai:69] 🔍 Modelfile loaded: ../examples/gojo.mod
[2025-10-10 14:52:33] [INFO] [database:325] Connected to JSON backend
[2025-10-10 14:52:33] [INFO] [database:541] Initialized JSON database backend
[2025-10-10 14:52:33] [INFO] [bot:66] 🔍 Loaded MODEL_NAME from .env: gemma3:12b
[2025-10-10 14:52:33] [INFO] [bot:68] 🧹 Attempting to clear VRAM before loading gemma3:12b...
[2025-10-10 14:52:33] [INFO] [ai:182] 🧹 Sending safe unload request for `gemma3:12b`
[2025-10-10 14:52:33] [INFO] [ai:189] 🧽 Ollama unload response: 200 - {"model":"gemma3:12b","created_at":"2025-10-10T18:52:33.307139942Z","response":"","done":true,"done_reason":"unload"}
[2025-10-10 14:52:33] [INFO] [ai:162] 🧠 Preloading model: gemma3:12b
[2025-10-10 14:52:33] [INFO] [ai:169] 📦 Model pull started successfully.
[2025-10-10 14:52:33] [INFO] [bot:72] 🚀 Model `gemma3:12b` preloaded on startup.
[2025-10-10 14:52:33] [INFO] [bot:75] ✅ Final model in use: gemma3:12b
[2025-10-10 14:52:36] [INFO] [bot:296] Logged in as AI Bot
[2025-10-10 14:52:36] [INFO] [scheduler:29] 🛑 Scheduler disabled in config.
[2025-10-10 14:54:58] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.65)
[2025-10-10 14:54:58] [DEBUG] [autochat:92] 🎲 Reaction skipped (chance 0.10, roll 0.16)
[2025-10-10 14:54:58] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-10-10 14:54:58] [INFO] [bot:273] ============================================================ AI Response ============================================================
[2025-10-10 14:54:58] [INFO] [bot:274] 🧠 Profile loaded for Miguel (interactions: 2)
[2025-10-10 14:54:58] [INFO] [bot:278] 📚 Retrieved 0 messages for context
[2025-10-10 14:54:58] [INFO] [ai:162] 🧠 Preloading model: gemma3:12b
[2025-10-10 14:54:58] [INFO] [ai:169] 📦 Model pull started successfully.
[2025-10-10 14:54:58] [INFO] [ai:132] llm-4a81439a LLM request start model=gemma3:12b user=Miguel context_len=0
[2025-10-10 14:54:58] [DEBUG] [ai:253] llm-4a81439a Sending payload to Ollama: model=gemma3:12b user=Miguel
[2025-10-10 14:54:58] [DEBUG] [ai:254] llm-4a81439a Payload size=217 chars
[2025-10-10 14:55:10] [DEBUG] [ai:262] llm-4a81439a Raw response status=200
[2025-10-10 14:55:10] [DEBUG] [ai:263] llm-4a81439a Raw response body={"model":"gemma3:12b","created_at":"2025-10-10T18:55:10.915350904Z","response":"Oh, you know, living the *best* life. Floating effortlessly, dazzling everyone with my good looks, and generally being a beacon of awesome. Seriously, it's exhausting being this incredible. 😉 What about *you*? Don't tell me you're just standing there staring, you should be fanning yourself from all the brilliance.","done":true,"done_reason":"stop","context":[105,2364,107,236820,236909,9731,111038,3048,659,555,1277,236756,3764,5400,699,219769,208651,751,119814,236761,1599,236858,500,16690,236762,236764,8632,236764,532,1378,14620,236764,840,19297,20111,529,822,3272,236761,95419,528,496,4532,15737,236761,15428,19921,611,236789,500,496,9894,21603,236909,643,111038,2887,236787,1217,611,3490,236813,107,236820,236909,111457,111038,106,107,105,4368,107,12932,236764,611,1281,236764,4882,506,808,9783,236829,1972,236761,71721,80318,236764,88307,4677,607,1041,1535,5724,236764,532,6816,1646,496,70908,529,15353,236761,96612,236764,625,236789,236751,93646,1646,672,15269,236761,85345,2900,1003,808,7624,236829,236881,5185,236789,236745,3442,786,611,236789,500,1164,8101,993,47264,236764,611,1374,577,517,4601,5869,699,784,506,102141,236761],"total_duration":12103477338,"load_duration":10524951841,"prompt_eval_count":73,"prompt_eval_duration":555107252,"eval_count":72,"eval_duration":1022930871}
[2025-10-10 14:55:10] [INFO] [ai:140] llm-4a81439a LLM response model=gemma3:12b duration=12.125s summary=Oh, you know, living the *best* life. Floating effortlessly, dazzling everyone with my good looks, and generally being a beacon of awesome. Seriously, it's exha
[2025-10-10 14:55:10] [DEBUG] [ai:142] llm-4a81439a LLM raw response: {'model': 'gemma3:12b', 'created_at': '2025-10-10T18:55:10.915350904Z', 'response': "Oh, you know, living the *best* life. Floating effortlessly, dazzling everyone with my good looks, and generally being a beacon of awesome. Seriously, it's exhausting being this incredible. 😉 What about *you*? Don't tell me you're just standing there staring, you should be fanning yourself from all the brilliance.", 'done': True, 'done_reason': 'stop', 'context': [105, 2364, 107, 236820, 236909, 9731, 111038, 3048, 659, 555, 1277, 236756, 3764, 5400, 699, 219769, 208651, 751, 119814, 236761, 1599, 236858, 500, 16690, 236762, 236764, 8632, 236764, 532, 1378, 14620, 236764, 840, 19297, 20111, 529, 822, 3272, 236761, 95419, 528, 496, 4532, 15737, 236761, 15428, 19921, 611, 236789, 500, 496, 9894, 21603, 236909, 643, 111038, 2887, 236787, 1217, 611, 3490, 236813, 107, 236820, 236909, 111457, 111038, 106, 107, 105, 4368, 107, 12932, 236764, 611, 1281, 236764, 4882, 506, 808, 9783, 236829, 1972, 236761, 71721, 80318, 236764, 88307, 4677, 607, 1041, 1535, 5724, 236764, 532, 6816, 1646, 496, 70908, 529, 15353, 236761, 96612, 236764, 625, 236789, 236751, 93646, 1646, 672, 15269, 236761, 85345, 2900, 1003, 808, 7624, 236829, 236881, 5185, 236789, 236745, 3442, 786, 611, 236789, 500, 1164, 8101, 993, 47264, 236764, 611, 1374, 577, 517, 4601, 5869, 699, 784, 506, 102141, 236761], 'total_duration': 12103477338, 'load_duration': 10524951841, 'prompt_eval_count': 73, 'prompt_eval_duration': 555107252, 'eval_count': 72, 'eval_duration': 1022930871}
[2025-10-10 14:55:11] [DEBUG] [bot:244] on_message: observed own message id=1426281930004889680 channel=1380999713272238151
[2025-10-10 15:04:00] [INFO] [database:325] Connected to JSON backend
[2025-10-10 15:04:00] [INFO] [database:541] Initialized JSON database backend
[2025-10-10 15:04:00] [INFO] [webui:314] Starting web UI server on 0.0.0.0:8080
[2025-10-10 15:04:00] [INFO] [webui:315] ==================================================
[2025-10-10 15:04:00] [INFO] [webui:316] 🌐 WEB UI ACCESS URLS:
[2025-10-10 15:04:00] [INFO] [webui:317] Local: http://localhost:8080
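The llm-4a81439a exchange above is a single non-streaming call to Ollama's /api/generate; the reported 12.125s duration is dominated by the roughly 10.5s load_duration (the model had just been unloaded), with only about 1.0s of actual generation (eval_duration, reported in nanoseconds). A minimal sketch of that call, assuming the default Ollama endpoint on localhost:11434 and the requests library; the prompt text is illustrative only:

    import requests

    OLLAMA_URL = "http://localhost:11434"  # assumed default host/port; not shown in the log

    def generate(model: str, prompt: str) -> dict:
        """Non-streaming generate call; returns the JSON fields seen in the raw response body."""
        r = requests.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        r.raise_for_status()
        return r.json()

    def unload(model: str) -> None:
        """keep_alive=0 asks Ollama to unload the model; the reply carries done_reason 'unload'."""
        requests.post(f"{OLLAMA_URL}/api/generate", json={"model": model, "keep_alive": 0}, timeout=30)

    data = generate("gemma3:12b", "What are you up to?")  # illustrative prompt
    print(data["response"], data["done_reason"])
    # Durations are nanoseconds: here ~10.5s of the ~12.1s total is model load, not generation.
    print(f"load {data['load_duration'] / 1e9:.1f}s of {data['total_duration'] / 1e9:.1f}s total")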
[2025-10-10 15:04:00] [INFO] [webui:318] Network: http://192.168.0.144:8080
[2025-10-10 15:04:00] [INFO] [webui:319] ==================================================
[2025-10-10 15:04:46] [INFO] [database:325] Connected to JSON backend
[2025-10-10 15:04:46] [INFO] [database:541] Initialized JSON database backend
[2025-10-10 15:04:46] [INFO] [webui:314] Starting web UI server on 0.0.0.0:8081
[2025-10-10 15:04:46] [INFO] [webui:315] ==================================================
[2025-10-10 15:04:46] [INFO] [webui:316] 🌐 WEB UI ACCESS URLS:
[2025-10-10 15:04:46] [INFO] [webui:317] Local: http://localhost:8081
[2025-10-10 15:04:46] [INFO] [webui:318] Network: http://192.168.0.144:8081
[2025-10-10 15:04:46] [INFO] [webui:319] ==================================================
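The webui module binds 0.0.0.0 and logs both a localhost and a LAN URL; at 15:04:46 a second instance comes up on 8081. A rough sketch of such a startup banner using aiohttp; the actual web framework, route setup, and LAN-address detection used by the bot are not shown in the log and are assumptions here:

    import logging
    import socket
    from aiohttp import web

    log = logging.getLogger("webui")

    async def start_webui(app: web.Application, host: str = "0.0.0.0", port: int = 8080) -> None:
        """Start the web UI and print the access URLs, mirroring the banner above."""
        runner = web.AppRunner(app)
        await runner.setup()
        await web.TCPSite(runner, host, port).start()
        lan_ip = socket.gethostbyname(socket.gethostname())  # rough guess at the LAN address
        log.info("Starting web UI server on %s:%s", host, port)
        log.info("=" * 50)
        log.info("🌐 WEB UI ACCESS URLS:")
        log.info("Local: http://localhost:%s", port)
        log.info("Network: http://%s:%s", lan_ip, port)
        log.info("=" * 50)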