Revised Information (After Feedback)
Stages of Implementation
The following step-by-step approach to launching CropWise shows that although our idea is complex, it is feasible in the long run. We split implementation into stages and will reveal each feature one by one, so that testing and debugging stay manageable.
1. We will manually test the targeted nozzle and horizontal spraying method on various crops. This will be a laboratory experiment in which pesticide levels in the soil, the plant, and the fruit are measured. We do this to control the variables and verify that our spraying method actually works.
2. We will recruit volunteer farms in Central Thailand (mainly grasslands) that have expressed interest in adopting this new product. An appropriate sample of farms covering the five top fruit categories will be chosen: durian, mangosteen, longan, mango, and pineapple. In this stage, effectiveness across the different crops will be measured.
3. Based on the results of stage 2, we will choose the fruit for which the method best reduced pesticide use while preserving fruit quality. We will then attach the nozzle and a pesticide tank to a drone and verify that it can fly. This is the Minimum Viable Product (MVP) that we will present to the public; we will certainly need to refine the drone system to make it aerodynamically suitable.
4. The MVP from stage 3 will be marketed to attract customers who want to switch to a targeted drone spraying method. The product we will give them is essentially a drone with a nozzle that is manually controlled from a remote. We will limit the pilot to 20 farms, completely free for everyone in the first 3 months, and each of our engineers will be assigned around 4 farms to look after. During this time, we will also mount a camera on the drone and record video; these recordings will be used to train our AI image-recognition model with footage from a diverse range of farms.
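The pilot logistics above (20 farms, roughly 4 per engineer) can be sketched as a simple round-robin assignment. All names here are illustrative placeholders, not our actual roster:

```python
def assign_farms(farms, engineers):
    """Round-robin assignment of pilot farms to engineers,
    so each engineer looks after roughly the same number of farms."""
    assignment = {e: [] for e in engineers}
    for i, farm in enumerate(farms):
        assignment[engineers[i % len(engineers)]].append(farm)
    return assignment

farms = [f"farm_{i:02d}" for i in range(20)]            # 20 pilot farms
engineers = ["eng_a", "eng_b", "eng_c", "eng_d", "eng_e"]
plan = assign_farms(farms, engineers)
# with 20 farms and 5 engineers, each engineer receives exactly 4 farms
```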
5. After training our AI, we will test it while concurrently developing the mobile application for users. The camera-equipped drone will fly out and, instead of merely recording video, continuously stream data to a server that processes the raw feed into usable data. We will adjust our methods as needed to address issues (e.g. a shaky camera or unclear images). Once the insect counts produced by the AI model correspond well with the images across various scenarios, we will launch the application so farmers can see the number of insects per grid cell. This is when revenue starts coming in, as more farms pay to use our product.
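The per-grid insect count shown to farmers can be sketched as a simple binning step over the detections the server extracts from the drone feed. The cell size and coordinates here are illustrative assumptions, not our actual field parameters:

```python
from collections import Counter

def insects_per_grid(detections, cell_size=10.0):
    """Bin individual insect detections (field coordinates in metres)
    into square grid cells, giving the per-cell counts the app displays."""
    counts = Counter()
    for x, y in detections:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return dict(counts)

detections = [(1.0, 2.0), (3.5, 8.0), (12.0, 4.0), (15.5, 6.0), (25.0, 25.0)]
grid = insects_per_grid(detections)
# grid == {(0, 0): 2, (1, 0): 2, (2, 2): 1}
```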
6. The AI training process will not stop there. We will continuously train the AI and run random quality checks on different farms to improve accuracy. Once the image-recognition AI is stable, we will proceed with the thermal camera. Thermal imaging is already a built-in feature of the camera we plan to buy (based on our research of existing cameras), and training on thermal imagery follows essentially the same process as regular image recognition. As we will already serve many farms, we will simply enable the thermal setting and collect data.
7. We will then scale to farms in northern Thailand, a large farming hub. Because of its mountainous topography, the north has common insects that make loud, distinctive noises. In this stage, we will test and deploy a bio-acoustic sensor that uses AI to estimate the number of insects in a grid cell from sound intensity and noise patterns. This is feasible because we have found pre-trained models that determine not only the number but even the species of insects in Southeast Asian mountains.
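As a toy illustration of the intensity side of the bio-acoustic idea, the sketch below maps the average loudness of audio frames to a density estimate. The linear calibration factor is a hypothetical stand-in; the real system would feed spectrogram features to the pre-trained insect-sound models mentioned above:

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one audio frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def estimate_density(frames, calibration=100.0):
    """Toy density estimate: mean frame RMS scaled by a calibration
    factor (hypothetical; stands in for a trained acoustic model)."""
    avg = sum(rms(f) for f in frames) / len(frames)
    return round(avg * calibration)

# two quiet-ish frames of constant amplitude, for illustration only
frames = [[0.1] * 4, [0.3] * 4]
density = estimate_density(frames)
```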
8. The final feature we will reveal is the autonomous drone. In previous stages, everything apart from the AI that collects and analyzes data was controlled manually. Now we will test-fly drones that spray crops on their own, employing our existing image-detection model to help train them. First, we will train the drone to navigate fields. Then, we will train the camera to identify crops that must be sprayed, since our targeted horizontal spray method requires spraying only the body of the crop. Lastly, we will add the nozzle and teach the drone to position itself and spray in an efficient, energy-conscious manner.
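The three autonomy stages above (navigate, identify, spray) can be sketched as a minimal mission loop. The phase names and the `needs_spray` callback are illustrative; in practice the callback would be our image-detection model's verdict:

```python
from enum import Enum, auto

class Phase(Enum):
    NAVIGATE = auto()   # stage 1: learn to traverse the field
    IDENTIFY = auto()   # stage 2: camera flags crops that need spraying
    SPRAY = auto()      # stage 3: position the nozzle and spray the crop body

def mission(waypoints, needs_spray):
    """Minimal sketch of the staged autonomy loop: visit each waypoint,
    and only identify + spray where the detector says it is needed."""
    log = []
    for wp in waypoints:
        log.append((Phase.NAVIGATE, wp))
        if needs_spray(wp):
            log.append((Phase.IDENTIFY, wp))
            log.append((Phase.SPRAY, wp))
    return log

actions = mission(["a", "b", "c"], needs_spray=lambda wp: wp == "b")
# -> navigate a, navigate b, identify b, spray b, navigate c
```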
9. In this stage, we hope to expand to all parts of Thailand. After gaining popularity, we may also scale to other Southeast Asian countries, such as Myanmar, Laos, and Cambodia, which have similar topography and insects. Our business selling point is the data we have collected, which makes our system accurate.
Feasibility and Scalability
Many of our features already exist individually, such as the thermal camera and the bio-acoustic sensor. Our uniqueness lies in combining them into a whole that reduces pesticide effects on the environment and on consumer and farmer health. Apart from creating new environmentally friendly pesticides, which can be difficult to use, expensive, or unfeasible for farmers, no other project has tackled this issue in a similar way before. We take a technology-based approach and support our customers at every step, helping improve their knowledge of agricultural innovations. This, together with the expected decrease in long-run costs for farmers, incentivizes them to adopt our solution.
Original Answers
Inspiration and problem identification
CropWise was inspired by the inefficiency and environmental impact of conventional pesticide use. The problem identified was the low effectiveness of pesticides, which leads to numerous health risks and environmental damage.
Solution and Features
CropWise introduces a data-driven solution that uses drone and sensor technology to apply pesticide precisely, and only where necessary. Key features include data collection through drone flyovers, bio-acoustic sensors that approximate insect density, and a specially designed nozzle for accurate pesticide application based on the collected data.
Any further potential problems that might occur when implementing it in the real world
Farmers may resist new technologies or face challenges integrating them into existing practices. We address this with a full-service model in which we cover costs and equipment, lessening the burden on farmers. The drone could also malfunction, sensors could be inaccurate, or software could glitch, any of which would hinder the system's effectiveness. We will continue to monitor for software inefficiencies, and as we collect more data, detection accuracy will increase.
Finances and business model of your project
Costs were allocated across four areas: 25% for research and development, 10% for marketing, 30% for human resources, and 20% for the software system. We also adopt a B2C2B business model, meaning revenue can come from selling the technology to organic and commercial farmers, agricultural consultants, and government agricultural departments. Additional income could potentially be derived from data services or subscriptions.
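The stated allocations sum to 85%, so a budget split along these lines leaves a remainder; the sketch below makes that arithmetic explicit, with the budget figure purely illustrative:

```python
def allocate(budget, shares):
    """Split a budget by percentage shares; whatever the shares do not
    cover is left in a reserve line (the listed shares sum to 85%)."""
    allocation = {area: budget * pct / 100 for area, pct in shares.items()}
    allocation["reserve"] = budget - sum(allocation.values())
    return allocation

shares = {"R&D": 25, "marketing": 10, "HR": 30, "software": 20}
plan = allocate(100_000, shares)
# plan["R&D"] == 25000.0 and plan["reserve"] == 15000.0
```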
Future places your project could be taken with additional work
We can potentially expand our services into other regions while developing technology options for different types of crops and farming practices. We will continue to improve drone and sensor technology and integrate machine learning for more accurate pest prediction. We can also partner with government agencies and research centers for continuous innovation, and further minimize environmental impact by incorporating more eco-friendly pesticides.
Built With
- canva
- photoshop