The AI International Linkage Office will hold an AI lecture on Thursday, 11/22!
Details are as follows; everyone is welcome to attend!
#Title: AI Chip Design Challenges at the Edge – from Deep Learning Model to Hardware
#Speaker: Bike Xie, Director of Engineering, Kneron Inc.
#Time: Thursday, 2018/11/22, 10:00-11:20 AM
#Venue: Room 106, Delta Building, National Tsing Hua University
#Registration and details: https://reurl.cc/yEv58
#Abstract: Since the remarkable success of AlexNet in the 2012 ImageNet competition, deep learning models, and CNN models in particular, have become the architecture of choice for many computer vision tasks. However, inference with a CNN model can be highly computationally expensive, especially on end-user devices such as Internet of Things (IoT) devices, which have very limited computing capability and only low-precision arithmetic operators. A typical CNN model may require billions of multiply-accumulate operations (MACs), load millions of weights, and draw several watts of power for a single inference. Limited computing resources and storage are the major obstacles to running computation-hungry CNNs on IoT devices.
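To make the "billions of MACs" claim concrete, the sketch below estimates the MAC count of a single convolutional layer from its output shape and kernel size. The specific layer dimensions (a hypothetical VGG-style 3x3 convolution) are illustrative assumptions, not figures from the talk.

```python
# Estimate the multiply-accumulate (MAC) count of one conv layer:
# MACs = H_out * W_out * C_out * (K * K * C_in)
def conv_macs(h_out, w_out, c_out, k, c_in):
    return h_out * w_out * c_out * k * k * c_in

# Hypothetical VGG-style layer: 224x224 output, 64 filters,
# 3x3 kernels over 64 input channels.
macs = conv_macs(224, 224, 64, 3, 64)
print(macs)  # 1849688064 -- nearly 2 billion MACs for a single layer
```

A full network stacks dozens of such layers, which is how per-inference totals reach the billions.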
Many design techniques in model structure, compilers, and hardware architecture are making it possible to deploy CNN models on edge devices. This talk discusses the design challenges for AI chips at the edge and briefly introduces these techniques. A well-designed compact model can require far less storage and computation; model compression techniques, including pruning, quantization, and model distillation, are therefore essential for deploying CNN models on edge devices. Compiling CNN models into hardware instructions is another critical step: operation fusion, partitioning, and ordering can significantly improve memory efficiency and inference speed. Finally, hardware architecture for AI chips is currently one of the hottest topics in circuit design. Dedicated AI accelerators provide an opportunity to optimize data movement so as to minimize memory access and maximize MAC efficiency.
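Of the compression techniques mentioned above, quantization is the most directly tied to the low-precision arithmetic of edge hardware. The following is a minimal sketch of symmetric per-tensor post-training quantization to int8, not Kneron's actual pipeline; the toy weight values are assumptions for illustration.

```python
import numpy as np

def quantize_int8(w):
    """Map float32 weights to int8 with one per-tensor scale."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.0, 0.25, 0.0], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Per-element reconstruction error is bounded by scale / 2,
# while storage drops from 4 bytes to 1 byte per weight.
```

Real deployments refine this with per-channel scales and calibration data, but the storage-for-accuracy trade-off is the same.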
Organizer: International Linkage Program, National Tsing Hua University AI Innovation Research Center
Contact information:
Ms. Tian, 03-5715131 ext. 34908
Ms. Huang, 03-5715131 ext. 34905