Food production and preparation have always been labor and capital intensive, but with the internet of things, low-cost sensors, cloud-computing ubiquity, and big data analysis, farmers and chefs are being replaced with connected, big data robots—not just in the field but also in your kitchen. Tim Gasper explores the tech stack, data science techniques, and use cases driving this revolution.
As presented in 2017 at O'Reilly: Strata + Hadoop World Conference and Data Day Texas.
https://arxiv.org/abs/1604.03169
1. Choice of deep learning architecture
• AlexNet
• GoogLeNet
2. Choice of training mechanism
• Transfer Learning
• Training from Scratch
3. Choice of dataset type
• Color
• Gray scale
• Leaf Segmented
4. Choice of training-testing set distribution
• Train: 80%, Test: 20%
• Train: 60%, Test: 40%
• Train: 50%, Test: 50%
• Solver type: Stochastic Gradient Descent
• Base learning rate: 0.005
• Learning rate policy: Step (decreases by a factor of 10 every 30/3 epochs)
• Momentum: 0.9
• Weight decay: 0.0005
• Gamma: 0.1
• Batch size: 24 (GoogLeNet), 100 (AlexNet)
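The "step" learning-rate policy in the solver settings above can be sketched in a few lines. This is the standard Caffe-style formula (lr = base_lr × gamma^floor(epoch / step)); reading the slide's "30/3 epochs" as a 30-epoch step size is an assumption for illustration.

```python
def step_lr(base_lr, gamma, step_size, epoch):
    # Caffe-style "step" policy: lr = base_lr * gamma ** floor(epoch / step_size)
    return base_lr * gamma ** (epoch // step_size)

# With the slide's hyperparameters (base_lr=0.005, gamma=0.1) and an
# assumed 30-epoch step, the rate drops by a factor of 10 every 30 epochs:
for epoch in (0, 29, 30, 60):
    print(epoch, step_lr(0.005, 0.1, 30, epoch))
```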
http://adigaskell.org/2015/08/21/texas-am-develop-the-next-generation-of-farming-drones/
The sensors being used typically include ultrasound devices to measure plant height, infrared thermometers to measure the temperature of the soil and of the plants, and hyperspectral sensors to measure the water content of leaves. They are also considering sensors to detect the normalized difference vegetation index (or photosynthesis rate, to you and me).
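NDVI itself is just a ratio of near-infrared to red reflectance; a minimal sketch, with reflectance values made up for illustration:

```python
def ndvi(nir, red):
    # Normalized difference vegetation index: (NIR - red) / (NIR + red).
    # Healthy leaves reflect strongly in near-infrared and absorb red light,
    # so values near +1 indicate dense, photosynthesizing vegetation.
    return (nir - red) / (nir + red)

# Illustrative reflectance readings (not from any real sensor):
print(ndvi(0.8, 0.1))   # healthy canopy, close to 1
print(ndvi(0.3, 0.25))  # sparse or stressed vegetation, close to 0
```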
In the wine country of France, the Wall-Ye 1000 can often be seen pruning rows of grapes.
https://arxiv.org/abs/1604.03169
CAFFE
Using a public dataset of 54,306 images of diseased and healthy plant leaves collected under controlled conditions, we train a deep convolutional neural network to identify 14 crop species and 26 diseases (or absence thereof). The trained model achieves an accuracy of 99.35% on a held-out test set, demonstrating the feasibility of this approach. When testing the model on a set of images collected from trusted online sources - i.e. taken under conditions different from the images used for training - the model still achieves an accuracy of 31.4%.
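The 80/20, 60/40, and 50/50 train-test distributions from the experiment design amount to a shuffled holdout split. A minimal sketch, where a list of indices stands in for the 54,306 images:

```python
import random

def holdout_split(items, train_frac, seed=0):
    # Shuffle once, then cut at the requested fraction to get
    # disjoint train and held-out test sets.
    rng = random.Random(seed)
    shuffled = list(items)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

# 54,306 image indices split 80/20, as in the largest training configuration:
train, test = holdout_split(range(54306), 0.8)
print(len(train), len(test))  # 43444 10862
```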
https://qz.com/295936/toshibas-high-tech-grow-rooms-are-churning-out-lettuce-that-never-needs-washing/
HYDROPONICS
FEED ALL OF JAPAN
http://gogreenagriculture.com/Home/
Happy Living Lettuce and other products
http://modernfarmer.com/2015/10/worlds-first-robot-farm/
DOESN'T REQUIRE ANY PEOPLE AT ALL
Japanese company called SPREAD (the only step they are still struggling to automate is transplanting seedlings, once they reach the proper germination level, to the long-term growing beds, but they are making good progress there).
https://www.youtube.com/watch?v=uI7aHhy8tyU
FARMBOT – OPEN SOURCE PROJECT
https://github.com/farmbot
Casabots
Deepak Sekar
Moley Robotics
The robotic hands, which are notoriously difficult to engineer, use 20 motors, 24 joints, and 129 sensors to reproduce the full range of movement of a human hand.