"list index out of range" error #121

Open
neowisard opened this issue Dec 14, 2023 · 2 comments

neowisard commented Dec 14, 2023

Hi, and thank you very much!
I added my admin ID and write to the bot directly: I get answers and everything works fine (ChatML preset).
Then I write to the bot directly as another user who is not in the admin list, while telegram_users.txt is empty. The bot starts responding ("X typing" is shown while generating), but I never see any reply.
The script logs the error ERROR:root:thread_get_messagelist index out of range('list index out of range',) and keeps running.

Where could I have made a mistake?

Or is it necessary to use a group chat for different users?

Full log:
Dec 16 19:38:22 openchat run.sh[5771]: INFO:root:### TelegramBotWrapper INIT DONE ###
Dec 16 19:38:22 openchat run.sh[5771]: INFO:root:### !!! READY !!! ###
Dec 16 19:38:22 openchat run.sh[5771]: INFO:aiogram.dispatcher.dispatcher:Start polling.
Dec 16 19:39:30 openchat run.sh[5771]: llama_print_timings: load time = 21282.28 ms
Dec 16 19:39:30 openchat run.sh[5771]: llama_print_timings: sample time = 211.07 ms / 86 runs ( 2.45 ms per token, 407.44 tokens per second)
Dec 16 19:39:30 openchat run.sh[5771]: llama_print_timings: prompt eval time = 46708.79 ms / 1087 tokens ( 42.97 ms per token, 23.27 tokens per second)
Dec 16 19:39:30 openchat run.sh[5771]: llama_print_timings: eval time = 20697.02 ms / 85 runs ( 243.49 ms per token, 4.11 tokens per second)
Dec 16 19:39:30 openchat run.sh[5771]: llama_print_timings: total time = 67857.71 ms
Dec 16 19:39:30 openchat run.sh[5771]: Llama.generate: prefix-match hit
Dec 16 19:39:33 openchat run.sh[5771]: llama_print_timings: load time = 21282.28 ms
Dec 16 19:39:33 openchat run.sh[5771]: llama_print_timings: sample time = 29.15 ms / 12 runs ( 2.43 ms per token, 411.69 tokens per second)
Dec 16 19:39:33 openchat run.sh[5771]: llama_print_timings: prompt eval time = 1117.42 ms / 32 tokens ( 34.92 ms per token, 28.64 tokens per second)
Dec 16 19:39:33 openchat run.sh[5771]: llama_print_timings: eval time = 1610.27 ms / 11 runs ( 146.39 ms per token, 6.83 tokens per second)
Dec 16 19:39:33 openchat run.sh[5771]: llama_print_timings: total time = 2788.88 ms
Dec 16 19:39:34 openchat run.sh[5771]: ERROR:root:thread_get_messagelist index out of range('list index out of range',)
Dec 16 19:39:51 openchat run.sh[5771]: Llama.generate: prefix-match hit
Dec 16 19:39:59 openchat run.sh[5771]: llama_print_timings: load time = 21282.28 ms
Dec 16 19:39:59 openchat run.sh[5771]: llama_print_timings: sample time = 86.89 ms / 36 runs ( 2.41 ms per token, 414.30 tokens per second)
Dec 16 19:39:59 openchat run.sh[5771]: llama_print_timings: prompt eval time = 2698.25 ms / 24 tokens ( 112.43 ms per token, 8.89 tokens per second)
Dec 16 19:39:59 openchat run.sh[5771]: llama_print_timings: eval time = 5428.98 ms / 35 runs ( 155.11 ms per token, 6.45 tokens per second)
Dec 16 19:39:59 openchat run.sh[5771]: llama_print_timings: total time = 8316.26 ms
Dec 16 19:40:00 openchat run.sh[5771]: ERROR:root:thread_get_messagelist index out of range('list index out of range',)


neowisard commented Dec 18, 2023

With the exception in thread_get_message no longer caught, the full traceback is:

ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-7' coro=<Dispatcher._process_polling_updates() done, defined at /usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/dispatcher.py:410> exception=IndexError('list index out of range')>
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/dispatcher.py", line 418, in _process_polling_updates
    for responses in itertools.chain.from_iterable(await self.process_updates(updates, fast)):
  File "/usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/dispatcher.py", line 236, in process_updates
    return await asyncio.gather(*tasks)
  File "/usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/handler.py", line 117, in notify
    response = await handler_obj.handler(*args, **partial_data)
  File "/usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/dispatcher.py", line 257, in process_update
    return await self.message_handlers.notify(update.message)
  File "/usr/local/lib/python3.10/dist-packages/aiogram/dispatcher/handler.py", line 117, in notify
    response = await handler_obj.handler(*args, **partial_data)
  File "/data/llm_telegram_bot/main.py", line 361, in thread_get_message
    reply = await self.send_message(text=answer, chat_id=chat_id)
  File "/usr/local/lib/python3.10/dist-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
  File "/data/llm_telegram_bot/main.py", line 230, in send_message
    text = await utils.prepare_text(text, user, "to_user")
  File "/data/llm_telegram_bot/source/utils.py", line 67, in prepare_text
    + cfg.translate_html_tag[0]
IndexError: list index out of range
<|im_start|>system
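For what it is worth, the traceback ends at cfg.translate_html_tag[0] inside prepare_text, so the most likely cause is that the translate_html_tag setting resolves to an empty list for this user, and reading its first element raises the IndexError. A minimal sketch of that failure mode (the names merely mirror the traceback; this is not the project's actual utils.py):

class Cfg:
    # Assumption: for a non-admin user this list ends up empty.
    translate_html_tag = []

cfg = Cfg()

def prepare_text(text):
    # Mirrors the failing line from the traceback: cfg.translate_html_tag[0]
    return cfg.translate_html_tag[0] + text + cfg.translate_html_tag[-1]

try:
    prepare_text("hello")
except IndexError as err:
    print("IndexError:", err)  # prints: IndexError: list index out of range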

neowisard commented:

There is an error in file "/llm_telegram_bot/source/utils.py", line 67, so I deleted the code of the prepare_text function, because the LLM can chat without the translation step.
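Instead of deleting prepare_text outright, a defensive guard around the tag list would likely avoid the crash while keeping the translation/markup step when it is configured. This is a rough sketch only, assuming the signature seen in the traceback (prepare_text(text, user, "to_user")) and a module-level cfg object; it is not the project's actual code:

import asyncio

# Hypothetical stand-in for the bot's config object; the real cfg lives elsewhere.
class _Cfg:
    translate_html_tag = []

cfg = _Cfg()

async def prepare_text(text, user, direction):
    # Defensive variant: only wrap the text when both an opening and a closing
    # tag are configured, otherwise return it unchanged instead of raising
    # "list index out of range" on translate_html_tag[0].
    tags = getattr(cfg, "translate_html_tag", None) or []
    if len(tags) >= 2:
        return tags[0] + text + tags[1]
    return text

print(asyncio.run(prepare_text("hello", user=None, direction="to_user")))  # -> hello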
