Hi,
I have built a custom model, “mynet_int8_fuse.cvimodel”. Now I want to run inference with this model on the Duo256 using the TDL SDK. All the samples and APIs in the TDL SDK are specific to a particular model architecture. How do I run inference with a custom model?
Thanks
Surendra
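(For reference, a minimal sketch of one way to do this. It bypasses the TDL SDK's model-specific wrappers and drives the .cvimodel directly with the TPU runtime API from cviruntime.h, the library the TDL SDK sits on top of, roughly as the CVITEK TPU SDK samples do. The dummy input fill and the printed summary are placeholders; real code has to preprocess and quantize the input exactly as mynet_int8_fuse.cvimodel expects, and the function names are assumed to match the ones used in those samples.)

```c
/* Sketch: run a custom .cvimodel with the CVITEK TPU runtime (cviruntime.h)
 * instead of the model-specific TDL SDK wrappers. Link against libcviruntime.
 * Paths and input preprocessing below are placeholders. */
#include <stdio.h>
#include <string.h>
#include "cviruntime.h"

int main(int argc, char **argv) {
  const char *model_path = argc > 1 ? argv[1] : "mynet_int8_fuse.cvimodel";

  CVI_MODEL_HANDLE model = NULL;
  if (CVI_NN_RegisterModel(model_path, &model) != CVI_RC_SUCCESS) {
    fprintf(stderr, "failed to register %s\n", model_path);
    return 1;
  }

  /* Query the input/output tensors described by the cvimodel itself. */
  CVI_TENSOR *inputs = NULL, *outputs = NULL;
  int32_t input_num = 0, output_num = 0;
  CVI_NN_GetInputOutputTensors(model, &inputs, &input_num, &outputs, &output_num);

  /* Fill the input buffer (dummy zeros here); real code must match the
   * layout and quantization the model was compiled with. */
  CVI_TENSOR *in = CVI_NN_GetTensorByName(CVI_NN_DEFAULT_TENSOR, inputs, input_num);
  memset(CVI_NN_TensorPtr(in), 0, CVI_NN_TensorSize(in));

  /* Run the TPU and look at the raw output buffers. */
  if (CVI_NN_Forward(model, inputs, input_num, outputs, output_num) != CVI_RC_SUCCESS) {
    fprintf(stderr, "forward failed\n");
  } else {
    for (int32_t i = 0; i < output_num; ++i) {
      printf("output[%d] %s: %zu elements\n", i,
             CVI_NN_TensorName(&outputs[i]), (size_t)CVI_NN_TensorCount(&outputs[i]));
    }
  }

  CVI_NN_CleanupModel(model);
  return 0;
}
```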
Hi Surendra,
I’ve implemented the TDL_custom_ methods from the documentation, but while nothing fails, it doesn’t work either: I only get empty output tensors. I have no clue how to debug this.
Have you had any luck running a custom model on the Duo?
Cheers from France!
Clem
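(One way to narrow this down, as a suggestion rather than anything from the SDK docs: check the return code of every CVI_TDL_Custom_* call, and run the same cvimodel once through the lower-level cviruntime API, as in the sketch above, and inspect the raw output buffers. If those are also all zero, the problem is in the model or its quantization; if they are not, the suspect is the wrapper setup, i.e. preprocessing parameters and tensor names. A small dump helper, assuming the cviruntime tensor accessors used in the CVITEK samples:)

```c
/* Debug helper sketch: count non-zero bytes in an output tensor after
 * CVI_NN_Forward(), to tell "the model produced nothing" apart from
 * "post-processing read the wrong buffer". Assumes cviruntime.h. */
#include <stdint.h>
#include <stdio.h>
#include "cviruntime.h"

static void dump_output_stats(CVI_TENSOR *t) {
  const uint8_t *bytes = (const uint8_t *)CVI_NN_TensorPtr(t);
  size_t size = (size_t)CVI_NN_TensorSize(t);
  size_t nonzero = 0;
  for (size_t i = 0; i < size; ++i) {
    if (bytes[i] != 0) {
      ++nonzero;
    }
  }
  printf("tensor %s: %zu bytes, %zu non-zero\n",
         CVI_NN_TensorName(t), size, nonzero);
}

/* Usage: call it on each element of the outputs array returned by
 * CVI_NN_GetInputOutputTensors():
 *   for (int32_t i = 0; i < output_num; ++i) dump_output_stats(&outputs[i]);
 */
```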
Hi Community,
For my custom cvimodel inference I did find cvi_tdl_custom.h, which helped me a lot, but I was also looking at how the CVI_TDL_Custom_* part of the CVI API is implemented (the C++ side). Unfortunately I can’t get my hands on it; it does not appear to be part of the tdl_sdk in the milkv duo-buildroot-sdk-v2.
Am I blind?
Can someone point me to where this source is implemented?
Thanks in advance,
Clem