Plant Disease Classification using Lite Pretrained Deep Convolutional Neural Network on Android Mobile Device
Burhanudin Syamsuri1, Gede Putra Kusuma2

1Burhanudin Syamsuri*, Computer Science Department, BINUS Graduate Program – Master of Computer Science, Bina Nusantara University, Jakarta, Indonesia.
2Gede Putra Kusuma, Computer Science Department, BINUS Graduate Program – Master of Computer Science, Bina Nusantara University, Jakarta, Indonesia.

Manuscript received on November 14, 2019. | Revised Manuscript received on November 23, 2019. | Manuscript published on December 10, 2019. | PP: 2796-2804 | Volume-9 Issue-2, December 2019. | Retrieval Number: B6647129219/2019©BEIESP | DOI: 10.35940/ijitee.B6647.129219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Image recognition using deep learning Convolutional Neural Network (CNN) models has proven to be highly effective for detecting symptoms of plant disease in agriculture. The computational efficiency of CNNs makes it possible to run such applications on mobile devices. This paper proposes to optimize the use of mobile devices by selecting the CNN model that delivers the highest accuracy with the lowest resource consumption when run as an on-device detection system. In this study, the PlantVillage dataset, extended with coffee leaf images, was used to test and compare three CNN models: two models designed specifically for mobile, MobileNet and Mobile NasNet (MNasNet), and one model recognized for its accuracy on personal computers (PC), InceptionV3. Experiments executed on both mobile and PC revealed a slight degradation in accuracy when the application runs on mobile. InceptionV3 retained its accuracy most consistently compared to MNasNet and MobileNet, yet it also incurred the largest latency. The final results on the mobile device show InceptionV3 achieving the highest accuracy of 95.79%, followed by MNasNet at 94.87% and MobileNet at 92.83%, while for latency MobileNet was the lowest at 394.70 ms, followed by MNasNet at 430.20 ms and InceptionV3 at 2236.10 ms. The outcome of this study is expected to be of great benefit to farmers, as mobile image recognition would help them analyze the condition of their plants on site simply by taking a picture of a leaf and running the classifier on their mobile device.
Keywords: Deep Learning, CNN, Pretrained Model, TF Lite, Mobile Application.
Scope of the Article: Deep Learning
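
The paper itself does not reproduce code, but the workflow summarized in the abstract (fine-tuning a pretrained CNN on leaf images, converting it to TensorFlow Lite, and running inference on a mobile device) can be sketched roughly as below. This is an illustrative sketch only, not the authors' implementation; the number of classes and the 224x224 input size are assumptions chosen for the example.

# Illustrative sketch (not the authors' code): fine-tune a pretrained MobileNet
# on a leaf-image dataset and export it to TensorFlow Lite for on-device inference.
# NUM_CLASSES and the 224x224 input size are placeholder assumptions.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 39  # placeholder: PlantVillage classes plus coffee leaf classes

# Pretrained ImageNet backbone with a new classification head (transfer learning).
base = tf.keras.applications.MobileNet(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the convolutional base
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train on leaf images

# Convert the trained Keras model to a .tflite file for the Android app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # default weight quantization
with open("plant_disease_mobilenet.tflite", "wb") as f:
    f.write(converter.convert())

# On-device inference uses the TFLite Interpreter (shown here in Python;
# the Android app would use the equivalent Java/Kotlin Interpreter API).
interpreter = tf.lite.Interpreter(model_path="plant_disease_mobilenet.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
image = np.random.rand(1, 224, 224, 3).astype(np.float32)  # stand-in for a leaf photo
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
print("predicted class:", np.argmax(interpreter.get_tensor(out["index"])))

The same conversion step would apply to the MNasNet and InceptionV3 variants compared in the paper; only the backbone (and its expected input size) changes.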