When I choose the model llama3.2:3b, it tries to load llama3.2:latest instead. An error shows up in the terminal, and the page on localhost just stays blank.
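To narrow down where the fallback happens, here is a minimal sketch that calls Ollama directly with the exact tag, assuming the backend is a local Ollama server on its default port (11434); the prompt text is just a placeholder. If this succeeds (after confirming the 3b tag is actually pulled, e.g. via `ollama list`) but the UI still tries llama3.2:latest, the issue is probably in how the frontend passes the model name.

```python
# Minimal sketch: ask the local Ollama server for the exact tag, not :latest.
# Assumes Ollama is running on its default port 11434.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.2:3b",   # exact tag, not llama3.2:latest
    "prompt": "Say hello.",   # placeholder prompt
    "stream": False,          # single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())
    # Non-streaming responses return the generated text in "response".
    print(body.get("response"))
```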