Research Article
Revised

Solar-powered motion sensor for farm monitoring and surveillance: a solar-powered assisted process

[version 3; peer review: 2 approved with reservations]
Previously titled: Proposed Solar-Powered Motion Sensor for Farm Monitoring and Surveillance: A Solar-Powered Assisted Process
PUBLISHED 04 Nov 2025

This article is included in the Agriculture, Food and Nutrition gateway.

Abstract

The farming industry faces continuous threats from pest control and farm security issues, because rodents cause significant damage to crops and disrupt farm operations. Traditional pest control methods require continuous human interaction, which proves both resource-intensive and inefficient. Modern agricultural practices benefit from sustainable solutions that combine renewable energy with smart technologies. This research presents an innovative solar-powered motion-sensor system that utilizes the OpenCV and YOLOv8 computer vision frameworks to autonomously detect and classify rodent intruders on farmland in real time. Initial testing results demonstrate the system’s ability to detect and deter rodent intruders. The OpenCV/YOLO system uses motion sensor signals to analyze movement patterns before classifying rodents into their respective groups. The solar-powered system operates continuously, decreasing the need for human intervention and enhancing farm surveillance capabilities. The model demonstrates its capability to defend crops from rodent damage and strengthen farm resistance against land degradation threats, resulting in improved crop yield and management. Key performance indicators (KPIs) were evaluated to demonstrate the system’s efficiency and reliability for managing animal pests in agricultural practice. The current system encounters problems with detecting wild animals beyond rodents as well as tracking rodent activity below ground level. Future developments could include improved pest capture systems alongside enhanced surveillance features for detecting both unauthorized human intruders and large animals. Future research should also pursue real-time field implementation with real data. Automated monitoring technology needs to be integrated with reliable sources of energy to create sustainable agricultural operations that are efficient and resilient, thereby enhancing food security.

Keywords

Solar-powered farm monitoring, rodent detection, automated surveillance, image recognition, sustainable agriculture

Revised Amendments from Version 2

In Version 3:
The title and author list are unchanged.
The abstract has been tightened to emphasize the demonstration-scale scope, the combined use of motion-sensor gating with YOLOv8 on a solar-powered edge device, and explicit limitations (small dataset; preliminary field validation).
Methods: added step-by-step details on data collection, labeling and augmentation, motion-trigger logic, and YOLOv8 training/inference settings, together with a tempered discussion of observed failure modes.
Figures added or updated:

  • End-to-end system architecture.
  • Motion-gated detection pipeline schematic.
  • Four KPI visualizations: Detections Per Frame, Confidence Distribution, Animal Detection Distribution, and Frames Per Second Over Time.
The Conclusion has been strengthened with distributional views (confidence scores, class counts) and temporal performance (FPS over time), with explicit caveats regarding small-sample constraints.

To read any peer review reports and author responses for this article, follow the "read" links in the Open Peer Review table.

1. Introduction

Motion sensors are devices used to detect objects moving within a given area. These sensors are self-activating, raising an alarm or performing a pre-programmed action upon detecting motion. In the agricultural domain, planning the motion-monitoring process is one of the most important steps toward protecting the farming business and increasing its efficiency. Today, farm monitoring and surveillance still involve manually driving around the farm or installing security cameras at vantage points within the farm. However, these approaches come with pitfalls, such as incomplete coverage, low penetration rates, and the need for frequent manual monitoring. They also depend on conventional energy sources, which can be expensive and harmful to the natural environment. Emmanuel et al. (2018) established that by using a wireless sensor network, instead of the usual method of erecting fences with sticks and ropes, farms could be monitored and controlled more effectively and sustainably. In addition to triggering lighting, a wireless sensor network can also activate an alarm and send an SMS or app notification to the farmer to take necessary action (Emmanuel et al., 2018).

1.1 Research gap

Current motion-sensor technologies implemented in farm monitoring systems have several drawbacks that hamper their efficacy, especially in the large-scale and remote environments of the farming sector. Almost all conventional motion sensors and CCTV cameras are designed to detect human movement or larger animal intrusions and are not effective at detecting smaller vertebrate pests such as rodents, which are well known to cause massive losses by raiding crops and stored produce. This results in huge losses, since rodent infestations normally go undetected until extensive damage has been done. Moreover, current surveillance systems require a constant supply of electrical power, which is not available in most rural and remote areas.

Another problem with current motion-sensor systems is that they are not responsive in real time, or at least not fully automated. Most typical systems rely on a human operator to watch monitors or respond to alerts, which raises labor costs considerably and hampers the efficiency of farm surveillance. These are post-incident systems, meaning they act after an incident rather than preventing it. In addition, their energy consumption is costly and raises the environmental costs of operation, making these systems unsustainable in their current form.

Hence, the research challenge lies in finding a solution that not only enhances detection accuracy for relatively small targets like rodents but also employs sustainable forms of power. The proposed solar-powered motion sensor system can fill these gaps by combining renewable energy, up-to-date sensors, and automation to monitor both large and small intrusions in an energy-efficient manner.

1.2 Research aim and objectives

The primary objective of this proposed study is to develop a solar-powered motion sensor system capable of recognizing rodent invasion on farms within the monitored area, thereby enhancing monitoring and surveillance efforts while consuming energy efficiently. The proposed model uses clean energy, making a strong contribution to sustainable energy solutions and environmental conservation: it avoids land degradation, keeps a clear view of the farm, and produces no pollution.

1.3 Research question

The proposed study is guided by the following research question:

How can a solar-powered motion sensor system utilizing computer vision and deep learning techniques be developed and implemented to accurately detect and classify animals on farms to enhance efficient farm monitoring and surveillance?

1.4 Research novelty

This study proposes a new concept of farm monitoring and surveillance through the use of solar-powered motion sensors coupled with computer vision and deep learning algorithms for rodent invasion detection. Unlike conventional farm surveillance systems, which are primarily meant to monitor human interference or large animal movement and require continuous electrical power, the proposed system is intended to be energy-autonomous, powered by solar energy. This not only meets the need to power such devices in areas without access to an energy grid but also contributes to worldwide sustainability by lessening the need for fossil fuels.

Furthermore, through its integrated computer vision technology, the system can differentiate between varying patterns of motion, for instance distinguishing wind-blown leaves from rodent movements. The use of deep learning allows the system to learn from the information collected, enhancing future detection results. This adaptive capability greatly improves the system’s chances of detecting the smaller pests that most traditional systems ignore. Further, real-time notification through a mobile application enables farmers to act instantly and avoid further harm that would otherwise require manual intervention.

This study’s uniqueness is found in the integration of renewable energy sources, advanced detection systems, and automation in the monitoring of farms. This approach has the possibility of radically changing the practice of monitoring farms, providing one distinctive tool for the farmers, especially those who are located in areas where they have no access to the normal electrical power source.

2. Literature review

The use of sensors has been vastly studied for the protection of agricultural farms and for detecting possible intruders instantly. Mrunal Khedkar (2021) proposed a system that employs wireless sensor nodes in conjunction with passive infrared (PIR) sensors and low-power digital cameras that can detect motion and take pictures of intruders. PIR sensors, ubiquitous in motion detection, are sensitive to variations in the infrared emissions that warm objects produce. On detection, a low-power camera, such as the OpenMV Cam, is actuated to take pictures or record video. These visual data are transmitted wirelessly through protocols such as ZigBee or Bluetooth, with long-range protocols such as ZigBee appropriate for large farm areas and short-range protocols such as Bluetooth suited to smaller ones. The base station collects the images and analyses them to identify the intruder; image quality is measured using peak signal-to-noise ratio (PSNR) and mean squared error (MSE).

Nevertheless, while this system is a good example of PIR sensors and low-power cameras used in farm monitoring, the whole system remains dependent on external power supplies. This makes it only marginally effective in remote farm areas where power availability is a challenge. Furthermore, conventional battery-based wireless sensor nodes may not sustain long hours of operation without frequent recharging or battery replacement.

Sowmika, Rohith Paul, and Malathi (2020) designed an Internet of Things (IoT)-based rodent detection system using a PIR sensor. The system is triggered by infrared radiation emitted from the bodies of rats and sends an alert to a cloud-based platform. The PIR sensor detects rodents at ranges up to 10 meters, and the system can operate live, with instant alerts sent to the farmer via a mobile application. The system provides real-time detection and wireless notification, but it has a limited range and, more critically, draws its power from conventional sources, a constraint in rural farming environments not connected to the grid.

A low-power bait station monitoring system for rodent detection, known as RatSpy, was developed by Ross et al. (2020). RatSpy employs a combination of sensors and cameras to detect rat movement and bait uptake without manual inspection. The results are sent wirelessly to pest control operators, enabling them to monitor the status remotely. This technology greatly decreases the labor expenses associated with monitoring bait station condition and the general rodent population, and its best feature is that it always actively scans for rodents. Nevertheless, although the system’s power consumption is low, it still requires periodic battery changes, making it impractical for large-scale farming or areas where battery replacement is not easily accessible.

Lai et al. (2023) offered an ingenious approach using IoT nodes with Long Range (LoRa) modules to look after the farm. These nodes are fitted with PIR sensors and an ESP32 camera for real-time image capture. The system transmits rodent-activity data wirelessly and stores it on a cloud server for processing. The flexibility of LoRa technology in transmitting information from one IoT node to another and on to the cloud server is well suited to large farms. However, the system shares a problem known from other vision-based systems: energy consumption is high because the cameras and sensors work continuously and need a stable power supply. Implementing deep learning algorithms in these systems could provide more precise rodent detection; however, deep learning models entail higher computational requirements that may further drain the energy supply, requiring a more efficient power source.

Cambra et al. (2017) proposed a low-power WSN system that employs a multi-hop wireless mesh network design for detecting rodent pests in agricultural fields. The system comprises a network coordinator, parent nodes (routers), and child nodes (sensor nodes). PIR motion sensors on the sensor-node hardware detect motion and relay the information through the mesh network to the network coordinator. Data transmission uses nRF24L01+ 2.4 GHz RF transceivers, which consume negligible power. The system also utilizes a number of power management features, including power-down, interrupt, and sleep modes, which further prolong battery lifespan. Though this system saves up to 90% of energy use, it still runs on conventional battery power that needs to be recharged. Moreover, the sensitivity of PIR sensors is limited, making fine-grained detection of small creatures like rats difficult.

Patel et al. (2021) explored a solar-powered IoT-based agricultural monitoring system that integrates solar panels to power wireless sensor networks and GSM communication devices. This approach significantly reduces reliance on conventional power sources and enhances the system’s feasibility in off-grid locations. The system monitors crop health, soil moisture, and environmental conditions using IoT sensors and transmits data wirelessly to the farmer’s mobile device. However, while this system addresses power-related challenges, it is not optimized for rodent detection or other farm surveillance needs, as its primary focus is on crop monitoring.

Adirala et al. (2025) devised an IoT-based detection system using PIR motion sensors to detect intrusions by large animals, such as cattle. The system employs a two-level deterrent mechanism, namely randomized carnivorous-animal sounds and high-intensity focus lights, together with wireless connectivity for remote monitoring. However, it is limited to detecting larger animals and is not sensitive enough for small pests like rodents. Thus, a more advanced detection mechanism is required to monitor a wider range of potential animal intruders.

The related studies above confirm that existing animal detection systems consume considerable power, and previous work has reported battery life too short to sustain the whole system infrastructure effectively. In addition, those models only detect and capture animals and other objects without classifying them. Our proposed model uses solar-charged lithium-ion batteries, which are convenient and reliable because they discharge and recharge easily, keeping the system running in real time and sustainable over long periods. Additionally, using the YOLO algorithm, the proposed system not only detects unwanted animals on farms but also classifies them into their respective groups, which supports decision making. For instance, the type and number of rodents detected enables farmers to make informed decisions when choosing suitable protective and pest control measures.

3. Methods

In order to understand the proposed architecture, a simulation of image recognition was carried out using an image detection and classification pipeline. The pipeline uses OpenCV and YOLOv8 as computer vision frameworks for detection, recognition, classification and counting of the animals. Data for model training was obtained from the COCO dataset, which provides a list of various animals and objects. The simulation focused on five groups of animals from this list, namely rat, mouse, squirrel, chipmunk and mole, which formed the classes of the rodent recognition model. The model was trained using 45 sample images per class, which were accordingly recognized and classified. The results are displayed on a dashboard with performance metrics used to test and determine the reliability and efficiency of the proposed model.
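As a concrete sketch of this training step: the five classes and the 45 samples per class come from the text, while the dataset config name (`rodents.yaml`), the base checkpoint, and the epoch count below are illustrative assumptions.

```python
# Sketch of the rodent-model training step. Only the five classes and the
# 45 samples per class are taken from the paper; the config file name,
# checkpoint, and epoch count are hypothetical.
CLASSES = ["rat", "mouse", "squirrel", "chipmunk", "mole"]
SAMPLES_PER_CLASS = 45

def train_rodent_model(data_cfg="rodents.yaml", epochs=50, imgsz=640):
    """Fine-tune a pretrained YOLOv8 checkpoint on the five rodent classes."""
    from ultralytics import YOLO   # requires `pip install ultralytics`
    model = YOLO("yolov8n.pt")     # small pretrained checkpoint (assumed)
    model.train(data=data_cfg, epochs=epochs, imgsz=imgsz)
    return model

print(len(CLASSES), "classes,", len(CLASSES) * SAMPLES_PER_CLASS, "training images")
# → 5 classes, 225 training images
```

The training call itself is deferred to a function so the class setup can be inspected without the heavy dependency installed.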

3.1 System infrastructure

Figures 1 and 2 show the system infrastructure, which comprises a motion sensor, flood light, motor, battery, speaker, rotating knob, smart camera, central database system and solar panel.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure1.gif

Figure 1. Front view of the system.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure2.gif

Figure 2. Top (left) and back (right) view of the system.

3.2 System components

3.2.1 Motion sensor

Image-based motion sensors analyze changes in the captured video to visually identify movement using cameras and computer vision algorithms (Futagami et al., 2020). They are extremely accurate for this system because they use sophisticated algorithms for motion pattern detection and object tracking.
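The core of such a sensor can be sketched as frame differencing: flag motion when enough pixels change between consecutive grayscale frames. The sketch below uses NumPy with illustrative thresholds; OpenCV’s `absdiff` and `threshold` perform the same operations on real video frames.

```python
import numpy as np

def motion_detected(prev_frame, curr_frame, pixel_thresh=25, area_thresh=50):
    """Return True when enough pixels differ between two grayscale frames.

    pixel_thresh: minimum intensity change for a pixel to count as "moved".
    area_thresh: minimum number of changed pixels to report motion.
    Both thresholds are illustrative, not values from the paper.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return int((diff > pixel_thresh).sum()) >= area_thresh

# Toy example: a static 100x100 scene, then a 10x10 bright object appears.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[40:50, 40:50] = 255
print(motion_detected(prev, curr))   # True: 100 changed pixels exceed the threshold
print(motion_detected(prev, prev))   # False: nothing changed
```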

3.2.2 Data transmission protocol

The Wi-Fi data transmission protocol facilitates data transfer between the system and the mobile application hosting the system’s GUI and database. Wi-Fi allows the transfer of data over a wide range (Pahlavan and Krishnamurthy, 2020), making it most convenient for this type of technology. When the system detects motion, the camera records and processes a video, and the system uses Wi-Fi to notify the farmer of the intrusion.
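The paper specifies Wi-Fi as the transport but not the notification API, so the sketch below only builds a hypothetical alert payload; in a real deployment it would be pushed over the Wi-Fi link (e.g. as an HTTP POST) to the mobile application’s backend. All field names and the camera identifier are assumptions.

```python
import json
import time

def build_alert(animal, confidence, camera_id="cam-01"):
    """Format an intrusion alert for the farmer's mobile app.

    The message schema and camera_id are hypothetical; the paper does not
    define them.
    """
    return json.dumps({
        "event": "intrusion",
        "animal": animal,
        "confidence": round(confidence, 2),
        "camera": camera_id,
        "timestamp": int(time.time()),
    })

print(build_alert("rat", 0.873))
```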

3.2.3 Rotating knob

The rotating knob moves the entire surveillance equipment, allowing the smart camera to capture images from all angles within the monitored area. It is connected to a motorized rotational mechanism, which rotates the entire surveillance equipment horizontally. Users can rotate the knob manually to adjust the viewing angle of the surveillance equipment.

It is powered by the battery, and when motion is detected the motorized rotational mechanism and motion detection sensors pause temporarily, ensuring that the camera focuses on the detected movement to capture images or video footage.

3.2.4 Smart camera

According to Kurniawan and Sofiarani (2020), a smart camera combines complex image sensors and processors with the capability to process images on its own without assistance from humans. It records high-quality video footage of the areas under observation only once the motion is detected, enabling remote surveillance, evidence collection, and in-depth analysis of any activity that is detected.

Smart cameras are equipped with high-resolution imaging sensors that record clear and detailed video footage. These sensors may use technologies such as charge-coupled devices (CCD) or complementary metal-oxide semiconductors (CMOS), which provide better image quality even in low light. The camera works with infrared illumination to capture objects in low light or at night, ensuring that the model runs 24 hours a day.

3.2.5 Flood light

The floodlight provides illumination in low-light or nighttime conditions. It typically uses energy-efficient light-emitting diode (LED) technology and switches on automatically once the motion sensor detects an object. LEDs deliver a high light output while using very little power, making them ideal for solar-powered systems where energy conservation is essential (Pulli et al., 2015). The floodlight is connected to the solar-charged battery, enabling it to draw power and operate effectively at night. This component is particularly useful for farm surveillance, as it can deter potential intruders and help the smart camera identify any activity occurring after dark.

3.2.6 Battery

The battery serves as the energy storage component in the system, ensuring a reliable and continuous power supply for the motion sensor, floodlight, and any other associated components. The preferred battery type for this system is lithium-ion batteries because they can be frequently charged and discharged without experiencing significant degradation and can be used for extended periods of time in solar-powered applications (Manthiram, 2017).

3.2.7 Speaker

The speaker is the audio output device within the surveillance system. Its main purpose is to emit sound for alarming and alerting. According to Bernardini, Bianchi and Sarti (2023), speakers utilize transducer technology to convert electrical signals into sound waves.

The motion sensor and speaker are combined so that when motion is detected, the motion sensor sends a signal to close the circuit linking the speakers. This causes the speakers to activate and emit sound alarms to frighten away intruders.

Figure 3 below is a flowchart showing the integration of system components. It also illustrates the flow of events and processes within the model.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure3.gif

Figure 3. System integration Flow-chart diagram.

3.3 Model illustration

To illustrate the proposed architecture, the image detection and classification pipeline described above was used to train and simulate the rodent recognition model. Training data was obtained from YOLO classes in the COCO dataset and grouped into five categories, namely rat, mouse, squirrel, chipmunk and mole, which formed the classes of the rodent recognition model.

3.3.1 Image recognition process in OpenCV

Preprocessing techniques are first used to simplify and lower noise in the image, as shown in Figure 4. These techniques include first grayscale conversion and then Gaussian blurring. Thresholding is done to create a binary image that highlights the important objects. This binary image has contours that show the edges of distinct objects. Each contour’s area is computed, and contours having areas outside of a predetermined range are removed. For each remaining contour, bounding rectangles are generated in order to identify and pinpoint each particular object in the image (Duwal and Tamang, 2024).

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure4.gif

Figure 4. Flowchart showing image processing steps.

4. Theoretical framework

4.1 The Photovoltaic (PV) model

The proposed solar-powered motion sensor system ( Figure 5) operates autonomously by harnessing clean, renewable energy from the sun. The photovoltaic (PV) module is the cornerstone of this system, converting sunlight directly into electrical energy through the photovoltaic effect. A photovoltaic system is an array of PV modules that comprise a number of solar cells that generate electrical power.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure5.gif

Figure 5. Photovoltaic system.
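The electrical behavior of such a module is commonly summarized by the single-diode model (a standard formulation quoted here from common practice, not taken from the paper):

```latex
I = I_{\mathrm{ph}} - I_0\!\left[\exp\!\left(\frac{q\,(V + I R_s)}{n k T}\right) - 1\right] - \frac{V + I R_s}{R_{sh}}
```

where I_ph is the photo-generated current, I_0 the diode saturation current, q the electron charge, n the diode ideality factor, k Boltzmann’s constant, T the cell temperature, and R_s and R_sh the series and shunt resistances.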

4.2 Photon absorption

When sunlight, basically composed of photons, strikes the solar cells, the photons transfer their energy to electrons within the semiconductor material, hence exciting them (Vinod, Kumar, and Singh, 2018). This energy transfer is the initial step in converting solar energy into electrical energy. The photon absorption circuit is shown in Figure 6.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure6.gif

Figure 6. Photon absorption circuit.

4.3 Formation of electric potential in p-n junctions

Each solar cell contains a junction between two types of semiconductor materials: n-type and p-type. The junction between the n-type and p-type materials forms an electric field, causing the excited electrons to migrate to the p-type layer and leaving behind a static positive charge. Simultaneously, the holes wander across the junction, leaving behind a static negative charge. Eventually, a depletion zone forms at the junction, preventing further movement of charge carriers (Kirchartz and Rau, 2018).

The separated static positive and negative charges establish an electric field across the depletion zone. This field generates the voltage necessary to drive current through an external circuit. As the semiconductor continuously absorbs sunlight, the energy excites more electrons, causing them to jump to the conduction band and leave behind holes in the valence band (Chaudhery Mustansar Hussain, 2018). These freed electrons contribute to the electric current, moving toward the negative end, while holes move toward the positive end.

According to Chaudhery Mustansar Hussain (2018), when the absorbed photon energy exceeds the PV cell material's bandgap energy, the atoms in the semiconductor collide, freeing more electrons and generating an electric current. This process ( Figure 7) effectively converts sunlight into electrical energy, harnessing solar power for practical use.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure7.gif

Figure 7. Formation of electric potential in p-n junctions.

4.4 Solar charge controller

A Solar Charge Controller performs several vital functions to optimize performance and safeguard equipment. Yassine and Anderson (2020) state that it prevents overcharging by regulating the voltage and current supplied to the batteries during sunlight hours, preserving battery life and preventing damage. Additionally, it blocks reverse current flow from batteries to panels, ensuring energy generated by the panels doesn't drain back into the battery during low-light or nighttime conditions.

4.5 The Lithium-ion (Li-ion) battery

This type of battery is predominantly used in portable electronic devices, making it convenient for powering the devices on the farm and at the server station (University of Washington, 2020). The Li-ion battery in the model consists of an anode, cathode, electrolyte, separator, positive current collector and negative current collector. It also has a charge meter that displays the charge and discharge levels of the battery. The anode is the negative electrode that stores lithium and releases Li-ions during discharge, whereas the cathode is the positive electrode that stores and releases Li-ions during charging. The electrolyte is a liquid medium that allows Li-ions to flow from one electrode to the other. The anode and cathode are separated by a separator, which permits the free flow of Li-ions between electrodes while preventing the flow of electrons within the battery structure (US Department of Energy, 2023). The positive and negative current collectors act as the positive and negative terminals, respectively (University of Washington, 2020) (Figures 8, 9 and 10).

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure8.gif

Figure 8. The Lithium-ion battery.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure9.gif

Figure 9. Charging of battery.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure10.gif

Figure 10. Discharging of battery.

Charging: Li-ions move from the cathode through the electrolyte to the anode, where they are stored (Figure 9).

Discharging: Li-ions flow back from the anode to the cathode, driving electrons through the external circuit to power the load (Figure 10).

4.6 The YOLO Algorithm

YOLO (You Only Look Once) is an image detection and classification algorithm first developed by Redmon et al. in 2015 (Ali and Zhang, 2024). It uses a single-stage object detection mechanism to detect objects in real time in a single shot through the network. This mechanism performs region proposal and object classification simultaneously, streamlining the image detection process.

Several enhancements have evolved the YOLO framework from its inception as YOLOv1 through YOLOv11. The versions differ in accuracy, efficiency, and speed when detecting real-time data, with each advancement reducing latency and improving accuracy (Ali and Zhang, 2024). Figure 11 below illustrates the general architecture of a YOLO single-shot detector.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure11.gif

Figure 11. Structure of a single-shot detector.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure12.gif

Figure 12. Frames Per Second Over Time.

Table 1 below summarizes the YOLO version and year of their establishment.

Table 1. YOLO variants.

Year of establishment    YOLO variant
2015                     YOLOv1
2016                     YOLOv2
2018                     YOLOv3
2020                     YOLOv4, YOLOv5
2021                     YOLOX, YOLOR
2022                     YOLOv6, YOLOv7
2023                     YOLOv8, YOLO-NAS
2024                     YOLOv9, YOLOv10, YOLOv11

YOLO performs object detection by dividing an image into a grid and simultaneously predicting both class probabilities and bounding boxes. A deep convolutional neural network (CNN) performs feature extraction through its convolutional layers to generate these class probabilities and bounding boxes. Detection of varying object sizes is enabled through anchor boxes at various scales. Finally, non-maximum suppression (NMS) filters out low-confidence and redundant predictions to refine the final output (Jiang et al., 2022). YOLO is therefore a reliable and highly efficient framework for animal detection in our model.
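The NMS step described above can be sketched as a short greedy routine; the thresholds below are illustrative defaults rather than the model’s actual settings.

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5, score_thresh=0.25):
    """Greedy non-maximum suppression: drop low-confidence boxes, then keep
    the highest-scoring box and suppress any box overlapping it too much."""
    boxes = np.asarray(boxes, dtype=float)
    scores = np.asarray(scores, dtype=float)
    keep = scores >= score_thresh            # confidence filter
    boxes, scores = boxes[keep], scores[keep]
    order = scores.argsort()[::-1]           # best score first
    kept = []
    while order.size:
        i = order[0]
        kept.append(boxes[i])
        rest = order[1:]
        # intersection-over-union of box i with the remaining boxes
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = ((boxes[rest, 2] - boxes[rest, 0])
                 * (boxes[rest, 3] - boxes[rest, 1]))
        iou = inter / (area_i + areas - inter)
        order = rest[iou < iou_thresh]
    return kept

# Two overlapping boxes of the same animal plus one low-confidence detection:
dets = nms([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]],
           [0.90, 0.80, 0.10])
print(len(dets))   # 1: the duplicate and the low-confidence box are suppressed
```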

5. Results & Discussion

The following Key Performance Indicators (KPIs) or metrics were used to test the efficiency of the model. The graphs display analytical results as obtained from the training and testing of the model.

5.1 Frames Per Second (FPS)

FPS shows the system’s processing speed in real time. The X-axis shows the frame number over time, while the Y-axis shows the number of frames processed per second. The results (Figure 12) display a high FPS (>20), indicating smooth running of the system and making it suitable for real-time monitoring.
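The per-frame FPS value plotted in this KPI can be computed with a rolling timer; the window size below is an arbitrary choice for illustration.

```python
import time
from collections import deque

class FPSMeter:
    """Rolling FPS estimate over the most recent `window` frame timestamps."""
    def __init__(self, window=30):
        self.stamps = deque(maxlen=window)

    def tick(self):
        """Record the completion time of one processed frame."""
        self.stamps.append(time.perf_counter())

    def fps(self):
        if len(self.stamps) < 2:
            return 0.0
        span = self.stamps[-1] - self.stamps[0]
        return (len(self.stamps) - 1) / span if span > 0 else 0.0

meter = FPSMeter()
for _ in range(10):
    meter.tick()
    time.sleep(0.01)          # simulate ~10 ms of per-frame processing
print(meter.fps() > 20)       # True: well above the 20 FPS real-time bar
```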

5.2 Animal Detection Distribution

This shows the types of detected animals together with their frequencies. From the results in Figure 13, the total number of animal detections was 49: mice have the highest detection frequency (19 counts), while moles, rats, squirrels and chipmunks have counts of 8, 7, 7 and 8, respectively. Farmers can therefore use the data as a guide in decision-making, assessing what measures to implement to prevent intrusion.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure13.gif

Figure 13. Animal Detection Distribution.
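Tallying per-class detections for this distribution is a simple counting step; the list below reproduces the counts reported in Figure 13 for illustration.

```python
from collections import Counter

# Class labels as they would stream out of the detector; counts mirror Figure 13.
detections = (["mouse"] * 19 + ["mole"] * 8 + ["rat"] * 7
              + ["squirrel"] * 7 + ["chipmunk"] * 8)

tally = Counter(detections)
print(tally.most_common(1))   # [('mouse', 19)]
print(sum(tally.values()))    # 49
```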

5.3 Classification Confidence Distribution

The confidence distribution shows the system’s confidence scores when performing animal classification. According to the results in Figure 14, most detections score between 0.8 and 0.85, with the highest frequency recorded at 10 and the lowest at 1. Detection scores above 0.8 indicate that the system is highly reliable.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure14.gif

Figure 14. Confidence Distribution.

5.4 Detections Per Frame

The X-axis shows the frame number while the Y-axis shows the number of animal detections per frame. Peaks indicate periods of high animal activity; valleys indicate reduced or absent activity; a consistent line across frames shows steady animal activity; and spikes show sudden or irregular movement of animals into the frames. According to the results in Figure 15, the peak recorded 3 detections, the most consistent level was 2 detections, low activity registered 1, and absence of activity registered 0. The results show that the model realistically captures both rodent invasion and its absence.

c0356bd8-31f2-4bed-8e6a-115a5a7fa85b_figure15.gif

Figure 15. Detection Per Frame.

6. Conclusion and Future Recommendations

The integration of computer vision technology with a lithium-ion battery as the power source enables the proposed system to carry out automated monitoring and surveillance on farms, providing a method to detect and classify rodents without manual intervention. The use of solar energy and the autonomous operation of the system lower labor expenses and improve farm management efficiency. The Li-ion battery makes the system reliable and self-sufficient, powering the model components throughout the day and night.

The YOLOv8 algorithm has proven reliable and efficient in detecting and classifying the animals for prompt action, owing to its single-shot detection architecture, low energy consumption, and fast data processing. The technology also improves security and productivity by precisely identifying and deterring animals, allowing prompt responses to possible threats such as land degradation and rodent intrusion in farms. Our model not only contributes to the fields of AI and ML but also demonstrates the use of green energy as part of sustainable energy solutions. The system’s applicability is therefore reliable and convenient, and its adoption in agriculture as a technological advancement is justified. However, future research should focus on real-world implementation with real-time field data.

Ethics approval

Not applicable.

Comments on this article (2)

Version 3 published 04 Nov 2025 (revised).

Version 1 published 25 Jun 2025; discussion is closed on this version.

  • Author Response, 10 Sep 2025 — Marwan Albahar
    Reviewer comment: “Replace simplified ML modeling with rigorous, transparent, and reproducible approaches.”
    Answer: “Using GTM is just demonstration of the model since it is proposed. The focus …”

  • Reader Comment, 10 Sep 2025 — Mohammad Alshehri, Taif University, Taif, Saudi Arabia
    “The article presents a timely and innovative solution for farm surveillance by integrating solar power, motion sensing, and AI-based rodent detection. The combination of OpenCV with a solar-powered autonomous setup …”
How to cite this article: Gazzawe F and Albahar M. Solar-powered motion sensor for farm monitoring and surveillance: a solar-powered assisted process [version 3; peer review: 2 approved with reservations]. F1000Research 2025, 14:624 (https://doi.org/10.12688/f1000research.164633.3)

Open Peer Review

Version 2 (published 16 Sep 2025, revised)

Reviewer Report, 09 Oct 2025 — G. Radhika Deshmukh, Shri Shivaji Science College, Amravati, Maharashtra, India
Status: Approved with Reservations

A brief summary of the peer review assessment for the article:
  1. Clarity and Literature Citation: Yes. The work is well-structured, clear in presenting aims and results, and cites recent, relevant literature.
  2. Study …

How to cite this report: Deshmukh GR. Reviewer Report For: Solar-powered motion sensor for farm monitoring and surveillance: a solar-powered assisted process [version 3; peer review: 2 approved with reservations]. F1000Research 2025, 14:624 (https://doi.org/10.5256/f1000research.187475.r415205)
Version 1 (published 25 Jun 2025)

Reviewer Report, 18 Aug 2025 — Roshahliza M. Ramli, Universiti Malaysia Pahang Al-Sultan Abdullah, Pahang, Malaysia
Status: Approved with Reservations

Suggestions for improving the paper:
  • Replace simplified ML modeling with rigorous, transparent, and reproducible approaches.
  • Improve experimental design and system evaluation with real-world benchmarks.
  • Broaden literature review with more recent and high-impact …

How to cite this report: Ramli RM. Reviewer Report For: Solar-powered motion sensor for farm monitoring and surveillance: a solar-powered assisted process [version 3; peer review: 2 approved with reservations]. F1000Research 2025, 14:624 (https://doi.org/10.5256/f1000research.181174.r394971)
