<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Purchased the Axera AX8850 and having problems]]></title><description><![CDATA[<p dir="auto">Hello,<br />
I just purchased the Axera AX8850 and am having problems running the Qwen3-1.7B model following the directions on the M5Stack website. When I run it, I receive the following errors:</p>
<p dir="auto">[json.exception.parse_error.101] parse error at line 1, column 1: attempting to parse an empty input; check that your input string or stream contains the expected JSON<br />
[E][ Init][ 65]: get uid failed, try again 1/10<br />
[json.exception.parse_error.101] parse error at line 1, column 1: attempting to parse an empty input; check that your input string or stream contains the expected JSON<br />
[E][ Init][ 65]: get uid failed, try again 0/10<br />
[json.exception.type_error.302] type must be number, but is null<br />
[E][ Init][ 89]: get bos_id failed, try again 9/10<br />
[json.exception.type_error.302] type must be number, but is null<br />
[E][ Init][ 89]: get bos_id failed, try again 8/10<br />
[json.exception.type_error.302] type must be number, but is null<br />
[E][ Init][ 89]: get bos_id failed, try again 7/10</p>
<p dir="auto">At the end of the error output I get the following:<br />
bos_id: 0, eos_id: 0<br />
terminate called after throwing an instance of 'nlohmann::json_abi_v3_11_3::detail::type_error'<br />
what(): [json.exception.type_error.302] type must be array, but is null<br />
./run_qwen3_1.7b_int8_ctx_axcl_aarch64.sh: line 12: 1716 Aborted ./main_axcl_aarch64 --system_prompt "You are Qwen, created by Alibaba Cloud. You are a helpful assistant." --template_filename_axmodel "qwen3-1.7b-ax650/qwen3_p128_l%d_together.axmodel" --axmodel_num 28 --url_tokenizer_model "<a href="http://127.0.0.1:12345" target="_blank" rel="noopener noreferrer nofollow ugc">http://127.0.0.1:12345</a>" --filename_post_axmodel qwen3-1.7b-ax650/qwen3_post.axmodel --filename_tokens_embed qwen3-1.7b-ax650/model.embed_tokens.weight.bfloat16.bin --tokens_embed_num 151936 --tokens_embed_size 2048 --use_mmap_load_embed 1 --live_print 1 --devices 0</p>
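<p dir="auto">For reference, the two exceptions in the log both come from the runner failing to get valid JSON back from the tokenizer server at http://127.0.0.1:12345: first an empty reply, then a reply whose bos_id is null instead of a number. The real runner is C++ using nlohmann::json; the following is only a minimal Python sketch of those two failure modes (the reply shapes and the id value are assumptions, not taken from the actual server):</p>

```python
import json

def parse_bos_id(reply: str) -> int:
    """Mimic the runner's checks on the tokenizer server's reply (sketch only)."""
    if not reply:
        # -> json.exception.parse_error.101: attempting to parse an empty input
        raise ValueError("empty reply from tokenizer server")
    cfg = json.loads(reply)
    bos_id = cfg.get("bos_id")
    if not isinstance(bos_id, int):
        # -> json.exception.type_error.302: type must be number, but is null
        raise TypeError("bos_id is null or missing")
    return bos_id

# A healthy reply (hypothetical id) parses; empty or null replies raise,
# matching the retries and the final abort in the log above.
print(parse_bos_id('{"bos_id": 151643}'))  # 151643
```

<p dir="auto">In other words, the abort is not the model itself crashing; the tokenizer service never handed back usable ids for the model it was asked about.</p>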
<p dir="auto">It looks like the tokenizer is running, but I can't get the model running. I have tried several models from Hugging Face and receive the same error. I am using a Pi 5 with 16 GB of memory. Could you please let me know how to fix this?</p>
<p dir="auto">Thank you,<br />
Mark</p>
]]></description><link>https://community.m5stack.com/topic/7999/purchased-the-axera-ax8850-and-having-problems</link><generator>RSS for Node</generator><lastBuildDate>Wed, 11 Mar 2026 03:58:00 GMT</lastBuildDate><atom:link href="https://community.m5stack.com/topic/7999.rss" rel="self" type="application/rss+xml"/><pubDate>Thu, 08 Jan 2026 13:04:43 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to Purchased the Axera AX8850 and having problems on Sat, 10 Jan 2026 21:51:55 GMT]]></title><description><![CDATA[<p dir="auto">Never mind, this was a model problem. I downloaded another model from Hugging Face and it worked. Most of the models I tried from there do not work; I only got two of them to run: Qwen2.5-7B and the DeepSeek one. Both were a little slow, so I compared them running on the CPU versus the AX8850. The CPU was slower, but not by much. This is on a Pi 5 16 GB board.<br />
I also have a Kinara Ara-2 with 16 GB of memory to work on, which would probably be faster, but I am having problems converting models to its format. I got the Pi 5 to recognize it via the driver; I just need a converted model to try it. I also have a Hailo-10H coming and can't wait to test it for a project; it's supposed to work better with Home Assistant.</p>
]]></description><link>https://community.m5stack.com/post/30433</link><guid isPermaLink="true">https://community.m5stack.com/post/30433</guid><dc:creator><![CDATA[mpro77]]></dc:creator><pubDate>Sat, 10 Jan 2026 21:51:55 GMT</pubDate></item></channel></rss>