Cannot evaluate code inferenced with LLama3.1 Instruct 8B #73

Open
wyettzeng opened this issue Feb 2, 2025 · 2 comments

Comments

@wyettzeng

I am doing best-of-64 evaluations with Llama 3.1 Instruct 8B. However, whenever I try to evaluate the inference output, the program always prints "Terminated" about a quarter of the way through (without any action on my part). Here are my inferences:

https://drive.google.com/file/d/1A9aOw-xy3p0bF5oJF7Eh_JgFEKGIAiwS/view?usp=sharing

[Image attachment]
@terryyz
Collaborator

terryyz commented Feb 2, 2025

I think this happens due to resource exhaustion, since we use concurrent execution. Some outputs may require a lot of resources to execute and cause unexpected termination. By default, we use all the CPUs on the machine. You may want to run on a machine with more resources, or limit the number of workers (via `parallel`).
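For illustration, here is a minimal sketch of the general idea (not the evaluator's actual code): capping the size of a process pool bounds peak CPU and memory use, which makes the OS less likely to kill the run. The `run_one` helper and the sample list below are hypothetical stand-ins for executing one generated solution.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def run_one(sample: str) -> int:
    # Hypothetical stand-in for executing one generated code sample.
    return len(sample)

samples = ["print('hello')"] * 64  # e.g. best-of-64 generations for one task

# Default behaviour described above: one worker per CPU core.
cpu_workers = os.cpu_count() or 1

# Capping the pool (e.g. at 4 workers) bounds concurrent resource use,
# at the cost of a slower evaluation.
with ProcessPoolExecutor(max_workers=min(4, cpu_workers)) as pool:
    results = list(pool.map(run_one, samples))
```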

@wyettzeng
Author

Got it, let me try again.
