MethodsX. 2025 Aug 22;15:103579. doi: 10.1016/j.mex.2025.103579

AI-powered automated hydroponic system for smart agriculture

Prof. Trupti Baraskar 1, Viren Khatri 1, Parimal Kolhe 1, Mitheelesh Katyarmal 1, Shaunak Khedekar 1
PMCID: PMC12423342  PMID: 40949829

Abstract

This research presents an AI-powered automated hydroponic system designed to enhance the efficiency and sustainability of modern agriculture. The system integrates real-time environmental monitoring, automated nutrient management, and AI-based disease detection to optimize plant growth and minimize manual intervention. An ESP32 microcontroller collects data from specialized sensors measuring Total Dissolved Solids (TDS), pH, temperature, and light intensity. Data is wirelessly transmitted via MQTT to an EMQX broker, subsequently processed by an ExpressJS backend, and stored in a Firebase Realtime Database. A NextJS web application provides a user-friendly dashboard for visualization, alerts, and remote control. Automation is achieved using relay-controlled peristaltic and water pumps that adjust nutrient dosing and circulation based on sensor readings. A camera module captures plant images, which are analyzed by a CNN model running on a separate AI server to detect common spinach diseases like Anthracnose and Downy Mildew, enabling early intervention. This integrated system combines IoT, cloud data management, automation, and AI-based visual inspection to offer a comprehensive solution for precision hydroponic farming. Evaluation demonstrates high accuracy in disease detection, robust system performance, and significant potential for improving crop health, yield, and reducing manual labor in diverse agricultural settings. The system, along with its full codebase, has been made publicly available to promote reproducibility.

  • Automated Precision Hydroponics: Combines real-time environmental monitoring, automated nutrient management, and AI-powered disease detection for optimized spinach cultivation.

  • Reproducible and Scalable Method: Provides a detailed, step-by-step protocol for constructing and operating the system, adaptable to various hydroponic setups and crop types.

  • Sustainable and Efficient Agriculture: Minimizes resource consumption, reduces manual labour, and promotes environmentally friendly practices.

Keywords: Hydroponics, Smart Agriculture, IoT, Convolutional neural network (CNN), Disease detection, Full stack development

Graphical abstract



Specifications table

Subject area: Engineering
More specific subject area: Agricultural Technology, Precision Agriculture, Plant Disease Detection, Automated Hydroponics
Name of your method: Construction and Operation of an AI-Powered Automated Hydroponic System for Spinach Cultivation and Disease Detection
Name and reference of original method: This method is novel and does not directly adapt any previously published protocols.
Value of the Protocol
  • Optimized Growth and Resource Efficiency: Automated control of environmental parameters and AI-driven disease detection maximize yields while minimizing resource use, making the system ideal for diverse settings.

  • Reduced Labor and Proactive Crop Management: Automation minimizes manual tasks, while data-driven insights and remote monitoring enable proactive crop management and improved quality.

  • Scalable, Reproducible, and Sustainable Solution: The modular, reproducible design promotes scalable precision agriculture and contributes to sustainable food production.

Background

The global demand for food is projected to increase significantly in the coming decades, placing immense pressure on existing agricultural practices to produce more with less. Factors such as climate change, water scarcity, and the shrinking availability of arable land pose significant challenges to conventional agriculture [1]. Hydroponics, a soilless cultivation technique [2], presents a viable alternative by enabling higher yields, efficient water usage, and controlled environment cultivation, decoupling food production from traditional limitations. However, conventional hydroponic systems often necessitate significant manual input for tasks like monitoring nutrient solutions, adjusting environmental parameters, and visually inspecting plants for diseases. This reliance on manual labour hinders the scalability, efficiency, and accessibility of hydroponics, particularly in urban settings and regions with limited labour resources [3].

This methodology aims to address these constraints by integrating automation and artificial intelligence into hydroponic cultivation. Our motivation stems from the need to develop a robust and reproducible protocol for constructing and operating an AI-powered automated hydroponic system. This system is designed to optimize plant growth by precisely controlling environmental factors, minimize manual intervention through automation, and empower data-driven decision-making through continuous monitoring and AI-based analysis [4]. This approach strives to enhance the productivity, sustainability, and overall feasibility of hydroponic farming, making it a more attractive and accessible option for a wider range of users, from hobbyists to large-scale commercial producers.

The core components of this methodology include real-time environmental monitoring using a suite of sensors, automated nutrient and water management through precisely controlled pumps, and AI-based disease detection via image analysis [5]. By automating these crucial aspects of hydroponic cultivation, the protocol reduces the demands on human labour and allows for finely tuned adjustments to growing conditions, responding dynamically to plant needs. The incorporation of a deep learning model for image-based disease diagnosis further elevates the system's capabilities. Early and accurate disease identification enables prompt interventions, minimizing crop losses and reducing the need for broad-spectrum chemical treatments, promoting healthier and more sustainable cultivation practices.

This methodology is built upon principles of adaptability and scalability. The modular nature of both the hardware and software components allows for flexible configuration and expansion to suit diverse hydroponic system designs, crop varieties, and scales of implementation. The detailed, step-by-step procedures outlined in the protocol ensure reproducibility, enabling researchers and practitioners to replicate the system with ease and tailor it to their specific requirements. The use of readily available, open-source software and affordable hardware components further enhances accessibility, lowering the barriers to entry for individuals and communities interested in adopting advanced hydroponic techniques.

Although this specific method focuses on spinach cultivation as a model crop, the underlying principles and technological components can be readily adapted to other leafy greens and, with further development, extended to a broader range of crops. The system's robust data acquisition capabilities provide valuable insights into plant growth dynamics and system performance. This rich dataset can be leveraged for further analysis, refinement of control algorithms, and optimization of growing parameters, leading to continuous improvement in yield, resource utilization, and crop quality. By disseminating this comprehensive and readily implementable methodology, we aim to foster wider adoption of AI and automation within the field of hydroponics and contribute to the advancement of sustainable and resilient agricultural practices for future food security.

Recent research highlights the potential of the Artificial Intelligence of Things (AIoT) to revolutionize hydroponic farming. Rahman et al. [6] developed an AIoT-based system for crop recommendation and nutrient parameter monitoring. Their system utilizes IoT sensors to collect real-time data on critical parameters like nutrient levels (N, P, K), pH, temperature, and humidity, transmitting this information to a cloud server for analysis. Leveraging a machine learning model trained on a dataset from the Indian Chamber of Food and Agriculture, the system recommends suitable crops based on the provided parameters and suggests adjustments to optimize the nutrient solution for improved growth. The researchers validated their approach through a lettuce cultivation test using Nutrient Film Technique (NFT) and Tower Garden methods, developing a user-friendly web interface for data input and crop recommendations. However, resource constraints limited the full implementation of automated monitoring and recommendations, necessitating manual data collection. Furthermore, the system's reliance on specific datasets restricts its generalizability and warrants further investigation with diverse crops and growing conditions.

Another study by Mehra et al. [7] proposed an intelligent IoT hydroponic system employing deep neural networks for real-time control of plant growth. Their system uses sensors to monitor key environmental parameters, feeding the data to edge devices (an Arduino and a Raspberry Pi 3) for processing. A deep neural network model, trained on historical data, predicts optimal growth conditions and adjusts system parameters accordingly. A cloud-based classification system supports data storage, analysis, and remote monitoring. Their results, demonstrating an 88 % accuracy rate in controlling tomato plant growth, highlight the power of deep learning in hydroponic systems. However, the computational intensity and "black box" nature of deep neural networks present challenges for scalability and explainability, respectively. The study's focus on tomato plants also limits the understanding of its applicability to other crops with varying nutrient requirements.

Both studies underscore the transformative potential of AIoT in hydroponic farming, offering real-time monitoring, automated control, and data-driven decision-making. These capabilities contribute to optimized nutrient solutions, precise control over environmental parameters, and data-informed crop recommendations, ultimately promoting resource efficiency and improved yields [8]. However, limitations related to data dependency, scalability, cost, and the need for more robust and generalizable models must be addressed in future research. Further exploration of user-friendly interfaces, security protocols, and comprehensive comparative studies is crucial for wider adoption and realizing the full potential of AIoT-driven hydroponic systems for sustainable and efficient food production.

Method details

Hydroponics, a method of growing plants without soil, is gaining traction as a sustainable alternative to traditional farming because of its efficient use of water, space, and nutrients, along with its ability to support year-round cultivation [9]. As urban farming and controlled environment agriculture expand, there is a growing need for systems that not only automate plant growth but also provide reliable monitoring and early detection of crop health issues. Advances in the Internet of Things (IoT), cloud computing, and artificial intelligence (AI) offer an opportunity to transform hydroponics into a fully automated and intelligent process [10]. Prior research has explored IoT-driven nutrient and environmental monitoring, and separate efforts have applied AI for disease detection in crops, yet very few approaches have combined these elements into a complete, reproducible framework [11]. This method addresses that gap by presenting an integrated hydroponic system that unifies ESP32-based sensing, real-time communication via MQTT, cloud-enabled data management, and AI-powered disease detection through a convolutional neural network [12]. By openly sharing code, configurations, and pretrained models, the system provides a practical, end-to-end platform that can be adapted for both academic research and real-world agricultural use.

Fig. 1 shows how sensor data and user commands are processed through the ESP32 microcontroller, which communicates with the EMQX MQTT broker to manage device control (e.g., pump triggers). Real-time sensor readings and control signals are synchronized with Firebase for persistent storage. The ExpressJS backend coordinates data exchange between components, sending sensor data to the NextJS frontend for visualization and relaying plant leaf images to the AI server for disease classification. Analysis results from the CNN model are returned to the frontend, enabling users to monitor crop health and control devices in real time. This architecture highlights the integration of hardware, communication protocols, AI inference, and user interaction into a reproducible, end-to-end pipeline.

Fig. 1.


System Architecture Diagrams for the AI-Powered Automated Hydroponic System.

Stage 1: hardware setup – building the physical infrastructure

We began by constructing the core of our system: the vertical hydroponic tower. We chose this design for its efficient use of space, crucial for our indoor setting, and its suitability for growing leafy greens like spinach. We opted for a commercially available tower, though constructing one from food-grade PVC pipes and containers is entirely feasible, offering greater customization. Regardless of the approach, ensuring the materials are food-safe is essential to prevent leaching harmful substances into the nutrient solution. For a DIY approach, PVC pipes with diameters of 4–6 inches work well.

Individual planting pockets were created by drilling holes along the length of the vertically arranged PVC pipes, sized appropriately for net pots or other chosen planting containers. The base of our tower was securely connected to the reservoir, with careful attention to creating a watertight seal.

A submersible pump, placed within the reservoir, served to circulate the nutrient solution. This pump is key for delivering nutrients and oxygen to the plant roots, preventing stagnation and promoting healthy growth [13]. The pump pushes the solution to the top of the tower, from where it flows down through the planting pockets and back into the reservoir for continuous recirculation.

For DIY setups, a sturdy supporting frame built from wood, metal, or another suitable material is essential for stabilizing the tower and reservoir and preventing spills.

Accurate and consistent monitoring was a priority, so we integrated several sensors into our setup. A pH sensor, TDS sensor, and EC sensor provided essential data on the nutrient solution’s acidity, concentration, and ionic strength, all connected to the ESP32’s analog input pins. These parameters are critical for nutrient availability and plant health. For temperature monitoring, we used a DS18B20 for precise nutrient solution temperature readings (placing it directly in the solution) and an AHT25 for ambient temperature and humidity.
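As an illustration of how a raw analog reading becomes a usable value, the sketch below converts an ESP32 ADC count from a pH probe into a pH estimate with a simple linear model. The calibration constants (V7, SLOPE) are hypothetical placeholders, not the deployed values; in practice they come from calibrating against standard buffer solutions, as described in Stage 3.

```python
# Illustrative sketch only: map an ESP32 ADC reading from an analog pH probe
# to a pH value via a linear calibration. All constants below are assumed
# placeholders; real values are obtained from two-point buffer calibration.

ADC_MAX = 4095        # ESP32 12-bit ADC full scale
V_REF = 3.3           # ADC reference voltage (volts)

V7 = 2.50             # assumed probe output (volts) in pH-7 buffer
SLOPE = -0.18         # assumed volts per pH unit (voltage falls as pH rises)

def adc_to_ph(raw: int) -> float:
    """Convert a raw ADC count to a pH estimate using the linear model above."""
    voltage = (raw / ADC_MAX) * V_REF
    return 7.0 + (voltage - V7) / SLOPE

# A reading near V7 volts should report approximately pH 7.0.
neutral_ph = adc_to_ph(round(V7 / V_REF * ADC_MAX))
```

The same two-point linear pattern applies to the TDS and EC channels, each with its own calibration constants.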

An RTC module was included to ensure accurate timestamps for data logging and scheduling, even if the ESP32 lost power. Finally, an LTR390 light sensor, positioned at canopy level, measured light intensity, a key factor in photosynthesis and plant growth.

Automation of control was achieved through a combination of pumps and relays. Peristaltic pumps, known for their precise dosing and backflow prevention, were connected to relay modules for automated nutrient delivery. Similarly, the main water pump, responsible for circulating the nutrient solution, was also relay-controlled. These relays act as switches, controlled by the ESP32, and importantly, isolate its low-voltage circuitry from the higher pump voltages.

It is critical to choose relays rated for the specific voltage and current demands of the pumps. We also incorporated optional LED grow lights, connected via a relay, to supplement or control lighting conditions, particularly beneficial in indoor settings.

For early disease detection, we set up a camera module [14]. Our AI model's complexity and processing speed needs dictated our choice between an ESP32-CAM (for simpler models) and a Raspberry Pi with a dedicated camera (for more demanding tasks). High-resolution images are vital for accurate disease identification, so camera selection is an important consideration. Consistent lighting and camera angle are also crucial for reliable image analysis, so we used a fixed mount and supplementary lighting to standardize image capture.

Finally, we addressed the practical aspects of power and protection. A regulated 12 V SMPS power supply provided stable power to all components, preventing damage from voltage fluctuations. Step-down converters were employed to provide appropriate lower voltages where necessary. We enclosed all electronics in a weatherproof enclosure to shield them from environmental factors, ensuring the longevity and reliability of the system.

For networking, both the ESP32 and the AI server were connected to our local network via Wi-Fi. A dedicated router or access point is advisable to maintain a stable connection, minimizing interference from other devices. If signal strength is an issue, consider a Wi-Fi range extender to ensure good coverage throughout the growing area.

For water consumption tracking, we integrated a water level sensor into our reservoir. A simple float sensor connected to the ESP32 provided a cost-effective solution. Careful calibration of this sensor was necessary for accurately detecting the full and empty states of the reservoir.

Within the ESP32 firmware, we implemented logic to record timestamps when these states were triggered. This data, coupled with the known reservoir volume, allowed us to calculate and track water consumption rates over time.
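The consumption-rate calculation described above can be sketched as follows. The firmware itself runs on the ESP32; this Python version only illustrates the arithmetic, and the 20 L reservoir volume and function name are assumptions for the example.

```python
# Sketch of the water-consumption calculation: the firmware logs a timestamp
# when the float sensor reports "full" (after a refill) and again at "empty";
# dividing the known reservoir volume by the elapsed time gives the rate.
from datetime import datetime

RESERVOIR_VOLUME_L = 20.0  # assumed usable volume between full and empty marks

def consumption_rate_l_per_day(full_ts: str, empty_ts: str,
                               volume_l: float = RESERVOIR_VOLUME_L) -> float:
    """Litres consumed per day between a 'full' event and the next 'empty'."""
    t_full = datetime.fromisoformat(full_ts)
    t_empty = datetime.fromisoformat(empty_ts)
    elapsed_days = (t_empty - t_full).total_seconds() / 86400.0
    if elapsed_days <= 0:
        raise ValueError("empty event must come after the full event")
    return volume_l / elapsed_days

# Example: a 20 L reservoir drained over 5 days -> 4 L/day.
rate = consumption_rate_l_per_day("2025-03-01T08:00:00", "2025-03-06T08:00:00")
```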

Fig. 2 illustrates the low-level architecture of the hardware module within the automated hydroponic system. It shows the various sensors used for monitoring environmental parameters (pH, TDS, water temperature, light intensity, ambient temperature and humidity), the ESP32 microcontroller that serves as the central processing unit, and the actuators (peristaltic pumps for nutrient and pH adjustment, and a water pump for circulation) responsible for controlling the hydroponic environment. The RTC module provides real-time scheduling and timestamps. The TFT display shows the local system status. Data flow from sensors and the RTC module to the microcontroller, as well as control signals from the microcontroller to the actuators and display, are also depicted.

Fig. 2.


Low-Level Hardware Module Diagram for Automated Hydroponic System.

Stage 2: software setup – the brains of the operation

  • A. ESP32 Firmware Development (Arduino IDE) – The Embedded Controller

We developed the ESP32 firmware using the Arduino IDE for its ease of use and widespread accessibility. This makes our method easier to replicate for those new to embedded systems programming, even though environments like PlatformIO offer more advanced features. We installed several key libraries: WiFi.h for network connectivity, PubSubClient.h to enable MQTT communication, DHT.h for our temperature and humidity sensor, and specific libraries for any other sensors in our setup. Leveraging these libraries streamlined our code significantly.

The firmware we wrote handles several critical tasks. It reads data from all connected sensors at intervals we defined, striking a balance between data granularity and power consumption. We then processed this raw sensor data, converting it into meaningful units (like pH and temperature in Celsius) and formatting it into JSON payloads for efficient transmission, including timestamps from the RTC module for later analysis and visualization. Using the PubSubClient library, the ESP32 connects to the MQTT broker, publishing sensor data to designated topics (e.g., esp32/sensors, esp32/waterlevel) and subscribing to the backend/pumpCommands topic for control instructions. When commands arrive via MQTT, the firmware activates or deactivates relays to control pumps and other actuators, incorporating safety checks to prevent accidental triggers. As described in Stage 1, the firmware also reads the water level sensor, tracks full/empty states, calculates the consumption rate, and publishes this data. Robust error handling is built-in to address sensor read errors, MQTT connection failures, etc., logging these events to the serial monitor or a remote logging service (if available). A watchdog timer is another useful addition for system stability, resetting the ESP32 if it becomes unresponsive.
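The firmware is written in Arduino C++; the Python sketch below only illustrates two of the behaviors described above: the shape of the JSON payload published to esp32/sensors, and the kind of safety check applied to commands arriving on backend/pumpCommands before any relay is switched. The field names and pump identifiers are assumptions, not the exact firmware schema.

```python
# Illustrative sketch (not the actual firmware): JSON payload construction
# for sensor publishing, and validation of incoming pump commands.
import json
from typing import Optional

def build_sensor_payload(readings: dict, timestamp: str) -> str:
    """Bundle converted sensor readings and an RTC timestamp into JSON."""
    payload = {"timestamp": timestamp, **readings}
    return json.dumps(payload)

VALID_PUMPS = {"nutrient_a", "nutrient_b", "ph_down", "circulation"}

def validate_pump_command(raw: str) -> Optional[dict]:
    """Accept only well-formed commands for known pumps; reject anything else."""
    try:
        cmd = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if cmd.get("pump") in VALID_PUMPS and cmd.get("action") in ("on", "off"):
        return cmd
    return None

msg = build_sensor_payload({"ph": 6.1, "tds_ppm": 840, "temp_c": 22.4},
                           "2025-03-01T08:00:00")
```

Rejecting malformed or unknown commands at the firmware level is what prevents the "accidental triggers" mentioned above, even if a bad message reaches the command topic.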

  • B. Backend Server Development (ExpressJS, Node.js) – The Central Hub

For the backend server, we chose ExpressJS, a popular and well-documented Node.js framework that streamlines API development. We installed the express, mqtt, firebase-admin, cors (for handling cross-origin requests if needed), and body-parser packages.

We then created several API endpoints to handle various requests. GET /sensorData retrieves sensor data from Firebase, allowing for filtering by date/time range, and we implemented pagination to efficiently handle larger datasets. POST /uploadImage receives image uploads from the frontend, temporarily storing them securely before triggering the AI server for analysis. POST /pumpCommand receives control commands, validating them before publishing them to the MQTT broker. GET /aiResults retrieves AI analysis results from Firebase, and GET /waterConsumption retrieves the ESP32's calculated water usage data. We used the ws library for WebSocket integration, creating a persistent connection with the frontend for real-time, bi-directional communication. This allows us to push live sensor readings, AI results, and water consumption data to the dashboard. Finally, using the Firebase Admin SDK, we integrated with the Firebase database, structuring it logically to store sensor data, AI results, and system configurations.
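The backend is ExpressJS; the Python sketch below only illustrates the pagination logic behind GET /sensorData: slicing a date-filtered result set into fixed-size pages with accompanying metadata. The parameter names (page, page_size) are assumptions for the example.

```python
# Illustrative pagination sketch for a sensor-data endpoint: return one page
# of results plus metadata so the frontend can render page controls.
def paginate(items: list, page: int, page_size: int = 50) -> dict:
    """Slice `items` into 1-indexed pages of `page_size` records."""
    total = len(items)
    start = (page - 1) * page_size
    return {
        "page": page,
        "page_size": page_size,
        "total": total,
        "total_pages": -(-total // page_size),  # ceiling division
        "data": items[start:start + page_size],
    }

readings = [{"seq": i} for i in range(120)]  # pretend Firebase query result
page3 = paginate(readings, page=3, page_size=50)  # last, partial page
```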

Fig. 3 details the architecture and data flow of the backend server, built using ExpressJS and Node.js, which serves as the central hub for the automated hydroponic system. It illustrates how the ExpressJS backend interacts with various components: subscribing to device topics and storing incoming IoT data from the EMQX MQTT broker, sending real-time updates to the Next.js frontend web application via WebSockets, and persisting data to the Firebase Database using the Firebase SDK. The diagram also shows how the frontend retrieves historical sensor data from Firebase. The API layer, comprising REST, WebSocket, and Firebase SDK elements, is shown providing structured access to the backend functionalities and data. This architecture enables efficient data management, real-time communication, and seamless integration between the hardware, AI components, and user interface.

  • C. AI Server Development (Python, TensorFlow, FastAPI) – The Intelligent Analyzer

Fig. 3.


Backend Server and Data Flow Architecture Diagram.

Using TensorFlow, Keras, and FastAPI, we built an AI backend that serves a reliable [15], low-latency model for classifying spinach leaf diseases [16]. Every essential step of this server pipeline, from data preprocessing to model deployment, is described in this section.

  • 1 Image Preprocessing

Image preprocessing is a critical step in the pipeline to ensure uniformity across the dataset and compatibility with the input expectations of the convolutional neural network (CNN) [17]. All input images collected from the spinach leaf dataset are originally in various resolutions, file formats, and aspect ratios. Therefore, several standard preprocessing operations were conducted:

    • Resizing: All images were resized to a fixed dimension of 128 × 128 pixels, matching the input shape required by the CNN model. This resolution was selected based on a balance between computational efficiency and feature retention.
    • Color Normalization: Images were loaded in RGB format, and each pixel value was rescaled from the original 0–255 range to a [0, 1] range. This was achieved by dividing all pixel values by 255.0. Normalization helps accelerate convergence during training and maintains consistency with the training data distribution during inference.
    • Batch Formatting: Before passing to the model, each image was converted to a 3D NumPy array and then expanded to a 4D batch format using np.expand_dims. This is required because TensorFlow expects batches of images, even when predicting a single sample.
    • Memory-Efficient Processing: Instead of saving preprocessed images to disk, we employed in-memory preprocessing using the Pillow and io.BytesIO libraries. This minimizes disk I/O and accelerates API response time during live predictions.
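The normalization and batch-formatting steps above can be sketched with NumPy as follows. The resize step would ordinarily happen first via Pillow (Image.resize((128, 128))); here we start from an already-resized uint8 RGB array so the sketch stays self-contained.

```python
# Sketch of the colour-normalization and batch-formatting steps: rescale
# 0-255 RGB pixels to [0, 1] and add the leading batch axis TensorFlow
# expects, even for a single image.
import numpy as np

def to_model_input(img_uint8: np.ndarray) -> np.ndarray:
    """Convert a (128, 128, 3) uint8 image to a (1, 128, 128, 3) float batch."""
    x = img_uint8.astype("float32") / 255.0       # colour normalization
    return np.expand_dims(x, axis=0)              # add batch dimension

fake_leaf = np.full((128, 128, 3), 255, dtype=np.uint8)  # stand-in image
batch = to_model_input(fake_leaf)
```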
  • 2 Dataset Source

The spinach disease dataset was obtained from the publicly available Mendeley Data repository: https://data.mendeley.com/datasets/n56pn9fncw/2.

  • 3 Image Pipelining Using Keras Generators

To streamline the training and validation process, we used ImageDataGenerator from Keras, which automates image loading and rescaling in batches.

  • Training Generator: The training images were loaded using a generator configured with rescale=1./255. No data augmentation was applied in the final model to focus strictly on feature learning.

  • Validation & Test Generators: Similarly, validation and test sets were loaded using identical rescaling settings. The test generator was configured with shuffle=False to preserve the order of labels for consistent evaluation.

  • Batch Size: A batch size of 32 was used, which is a standard compromise between GPU utilization and training speed.

  • Class Mode: The class_mode='categorical' was specified to support multi-class classification with one-hot encoded labels.

  • 4 CNN Model Development

The convolutional neural network was architected to extract hierarchical features and perform three-class classification (Anthracnose, Downey-Mildew, Healthy-Leaf). A summary of the layers and parameters is shown below:

Table 1 details the architecture of a Convolutional Neural Network (CNN) used for classifying plant leaf images into disease categories. The model accepts RGB images of size 128 × 128 × 3 and passes them through successive convolutional, pooling, and dropout layers to extract hierarchical features. A series of Conv2D layers with ReLU activations are used for feature extraction, interleaved with MaxPooling and Dropout layers to reduce spatial dimensions and prevent overfitting. The final flattened output is fed into dense layers, concluding with a Softmax layer that outputs probabilities across three disease classes. The network consists of approximately 6.5 million trainable parameters.

Table 1.

CNN Architecture for Leaf Disease Classification.

Layer | Output Shape | Parameters | Notes
Input | (128, 128, 3) | 0 |
Conv2D (3 × 3, 64) | (126, 126, 64) | 1,792 | ReLU activation
MaxPooling2D (2 × 2) | (63, 63, 64) | 0 |
Conv2D (3 × 3, 64) | (61, 61, 64) | 36,928 | ReLU activation
MaxPooling2D (2 × 2) | (30, 30, 64) | 0 |
Dropout (rate 0.1) | (30, 30, 64) | 0 | Reduces overfitting
Conv2D (3 × 3, 128) | (28, 28, 128) | 73,856 | ReLU activation
MaxPooling2D (2 × 2) | (14, 14, 128) | 0 |
Dropout (rate 0.2) | (14, 14, 128) | 0 | Regularization
Flatten | (25,088) | 0 | Converts to 1D vector
Dense (256) | (256) | 6,422,784 | ReLU activation
Dropout (rate 0.2) | (256) | 0 | Regularization
Dense (3, Softmax) | (3) | 771 | Output probabilities for 3 classes
Total Parameters | | 6,536,131 |
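The parameter counts in Table 1 follow directly from the standard Keras formulas: a Conv2D layer has (kh × kw × in_channels + 1) × filters parameters, and a Dense layer has (in_units + 1) × units. The sketch below recomputes each trainable layer of the architecture as a consistency check.

```python
# Recompute the trainable-parameter counts of Table 1 from first principles.
def conv2d_params(kh: int, kw: int, in_ch: int, filters: int) -> int:
    """Weights (kh*kw*in_ch per filter) plus one bias per filter."""
    return (kh * kw * in_ch + 1) * filters

def dense_params(in_units: int, units: int) -> int:
    """Weights (in_units per unit) plus one bias per unit."""
    return (in_units + 1) * units

layer_params = [
    conv2d_params(3, 3, 3, 64),     # Conv2D (3x3, 64) on RGB input
    conv2d_params(3, 3, 64, 64),    # Conv2D (3x3, 64)
    conv2d_params(3, 3, 64, 128),   # Conv2D (3x3, 128)
    dense_params(25088, 256),       # Dense(256) after Flatten (14*14*128)
    dense_params(256, 3),           # Dense(3, Softmax)
]
total = sum(layer_params)           # total trainable parameters
```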

Table 2 outlines the key hyperparameters used to train the leaf disease classification model. The model is optimized using the Adam optimizer with categorical crossentropy as the loss function and accuracy as the evaluation metric. Training is performed over 10 epochs with a batch size of 32. The input images are resized to 128 × 128 pixels. Dropout regularization is applied at rates of 0.1 after the second convolutional block and 0.2 elsewhere. Minimal data augmentation is used, limited to rescaling of image pixel values.

Table 2.

Model Training Hyperparameters.

Hyperparameter | Value
Optimizer | Adam
Loss Function | Categorical Crossentropy
Evaluation Metric | Accuracy
Batch Size | 32
Epochs | 10
Input Size | 128 × 128
Dropout Rates | 0.1 (after 2nd conv block), 0.2 (others)
Data Augmentation | None (rescale only)

The CNN model architecture was intentionally kept lightweight to enable real-time inference and deployment on modest hardware. This balance between model complexity and performance ensures practical usability in low-resource environments [18]. Future improvements could explore deeper architectures or transfer learning (e.g., MobileNet, ResNet) to enhance classification accuracy across more diverse plant diseases.

  • 5 Deployment via FastAPI

To expose the model as a RESTful service:

  • a.
    Startup Routine
    • The HDF5 model file (best_model.h5) is loaded once at server launch.
    • A dictionary mapping numeric class indices to labels is initialized (e.g., 0 → Anthracnose).
  • b.
    Prediction Endpoint (POST /predict)
    • Input Validation: Ensures uploaded content is an image (rejects other MIME types with HTTP 400).
    • Preprocessing: Performs the same resize, normalize, and batch formatting steps in memory.
    • Inference: Invokes model.predict(), selects the highest-probability class, and maps it to a label.
    • Response: Returns a lightweight JSON object, e.g., {"predicted_label": "Downey-Mildew"}
  • c.
    Error Handling
    • Non-image uploads result in explicit HTTP 400 errors.
    • Unexpected failures (e.g., corrupted image bytes) yield HTTP 500 with diagnostic details.
  • d.
    Scalability & Integration
    • Served via Uvicorn (ASGI), enabling asynchronous request handling and horizontal scaling behind Kubernetes or Docker Swarm.
    • The API can be invoked by web dashboards, mobile apps, or embedded devices for near-real-time field diagnostics.
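The post-processing inside the prediction endpoint can be sketched as below: take the model's softmax output, select the highest-probability index, and map it through the label dictionary initialised at startup. The index order shown assumes Keras's alphabetical class-index assignment; verify it against the trained generator's class_indices before relying on it.

```python
# Sketch of the /predict post-processing: argmax over class probabilities,
# then map the winning index to its label. Index order is an assumption
# (Keras assigns class indices alphabetically by folder name).
CLASS_LABELS = {0: "Anthracnose", 1: "Downey-Mildew", 2: "Healthy-Leaf"}

def to_prediction(probs: list) -> dict:
    """Build the JSON body returned by the prediction endpoint."""
    best = max(range(len(probs)), key=lambda i: probs[i])  # argmax
    return {"predicted_label": CLASS_LABELS[best]}

result = to_prediction([0.05, 0.91, 0.04])  # softmax output from the CNN
```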

Fig. 4 illustrates the low-level design of the AI Model for leaf disease detection. The process starts with a LeafImage input passed to the AIModel’s classify() method. The image undergoes preprocessing, followed by inference through a CNN-based DiseaseDetectionModel. The prediction output is encapsulated in an AnalysisResult, indicating disease type and confidence score. This modular design supports automated image analysis and disease identification within the system.

  • D. Frontend Dashboard Development (NextJS, React) – The User Interface

Fig. 4.


Low Level design of AI Model.

We chose NextJS to build our frontend dashboard, combining the benefits of React with features like server-side rendering and improved performance. This framework choice aligns with our goal of creating a responsive and efficient user interface. Key dependencies we installed included react, next as the core UI libraries, chart.js for visualizing sensor data, and socket.io-client to handle real-time communication with the backend via WebSockets [19]. Additionally, a UI component library such as Material UI or Ant Design helped us create a polished and consistent interface more quickly.

The dashboard itself is composed of several key components. The main Dashboard component displays real-time sensor data using interactive charts. We used line charts to visualize time-series data from sensors like pH, temperature, and light intensity, making it easy to track trends and spot anomalies. WebSockets play a crucial role here, enabling the charts to dynamically update as new data arrives from the backend.

We also implemented an Image Upload component, which allows users to select and upload images of their spinach plants for disease analysis. Security considerations were paramount here; we validated file types and sizes on both the frontend and backend to prevent any potential vulnerabilities. The Control Panel component provides manual control over the hydroponic system. We included clearly labeled buttons or toggles for activating or deactivating the pumps and other actuators. Clicking these controls sends commands to the backend's /pumpCommand API endpoint. The AI Results Display component presents the disease prediction and confidence score received from the AI server, giving users clear and actionable information. To provide further context, we displayed the uploaded image next to the prediction. Finally, the Water Consumption Tracking component presents both the current water consumption rate and a historical view of water usage, aiding in resource management. WebSocket integration was essential here as well, ensuring the dashboard reflects the latest data and system status in real time. The socket.io-client library facilitated this connection to the backend's WebSocket server, providing a seamless channel for data updates to flow to the frontend and enhancing the overall user experience.

Stage 3: system integration, operation, and data collection

A. System Integration and Calibration

We first calibrated each sensor (pH, TDS, EC, temperature, and water level) individually using the manufacturer's recommended calibration solutions and procedures. Accurate sensor readings are fundamental to reliable system performance, so this step is crucial. We then set up our MQTT broker (EMQX), configuring connection parameters and defining the necessary topics—esp32/sensors, backend/pumpCommands, esp32/waterlevel—for communication between the ESP32, backend, and frontend. We chose MQTT because it's a lightweight and efficient messaging protocol well-suited for IoT systems like ours. Next, we confirmed the connection between our ExpressJS backend and the Firebase database, ensuring the data structures in Firebase were correctly defined to store the incoming sensor readings, AI predictions, and water consumption data.
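The MQTT topics above can be wired into the backend subscriber as follows. The topic names are taken from the paper; the JSON payload shape and plausibility ranges are assumptions for illustration, and saveToFirebase is a hypothetical helper:

```javascript
// Topic names as used in this system; payload shape is an assumption.
const TOPICS = {
  sensors: "esp32/sensors",
  pumpCommands: "backend/pumpCommands",
  waterLevel: "esp32/waterlevel",
};

// Parse and sanity-check a sensor message before storing it.
function parseSensorPayload(raw) {
  let msg;
  try {
    msg = JSON.parse(raw);
  } catch {
    return null; // malformed JSON from the device
  }
  const { ph, tds, temperature, light } = msg;
  // Reject physically implausible readings (often a loose or drifting probe).
  if (typeof ph !== "number" || ph < 0 || ph > 14) return null;
  if (typeof tds !== "number" || tds < 0) return null;
  return { ph, tds, temperature, light, receivedAt: Date.now() };
}

// Typical use with the mqtt package (sketch):
// client.subscribe(TOPICS.sensors);
// client.on("message", (topic, payload) => {
//   const reading = parseSensorPayload(payload.toString());
//   if (reading) saveToFirebase(reading); // hypothetical persistence helper
// });
```

Validating at the broker boundary keeps corrupt readings out of Firebase, so the dashboard and control logic only ever see plausible values.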

Firebase's real-time synchronization capabilities simplified our backend development considerably. With the database configured, we integrated our AI server. We deployed the trained CNN model and tested the /predict endpoint thoroughly with sample images before integrating it with the backend's /uploadImage functionality. This end-to-end connection was critical for seamless image-based disease detection. Finally, we performed a full system test, sending test data from the ESP32 to verify its arrival at the backend and storage in Firebase. We also triggered test image uploads, ran them through the AI analysis pipeline, and confirmed the results were correctly displayed on the frontend dashboard. This comprehensive testing allowed us to identify and resolve any integration issues before proceeding to full system operation.

B. System Operation and Data Collection

With the system fully integrated, we began operation by powering on the hydroponic setup and initiating all software components: ESP32 firmware, backend server, AI server, and the frontend application. The sequence here is important; the MQTT broker and backend server must be running before the ESP32 attempts to connect. We then used our frontend dashboard to monitor sensor readings in real time, verifying their accuracy against independent measurements. Continuous monitoring [20] gave us immediate insight into the system's performance and the health of our spinach plants. Next, we activated the automated control logic in the backend. This logic governed automated nutrient dosing, maintaining the water level, and, in our case, controlling the LED grow lights – all based on predefined thresholds and rules we had established. This automation minimized the need for manual adjustments, ensuring consistent growing conditions.
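The threshold-based control logic described above can be sketched as a pure decision function mapping sensor readings to relay commands. The setpoints below are illustrative values for spinach, not the thresholds actually configured in the study:

```javascript
// Illustrative setpoints; the paper's actual thresholds are not published here.
const SETPOINTS = {
  phMin: 5.5, phMax: 6.5,      // target pH window
  tdsMin: 800, tdsMax: 1200,   // target nutrient concentration (ppm)
  waterLevelMin: 40,           // minimum reservoir level (percent)
};

function decidePumpActions(reading, sp = SETPOINTS) {
  return {
    phDownPump: reading.ph > sp.phMax,      // dose pH-down solution
    phUpPump: reading.ph < sp.phMin,        // dose pH-up solution
    nutrientPump: reading.tds < sp.tdsMin,  // dose nutrient concentrate
    freshWaterPump:
      reading.tds > sp.tdsMax ||            // dilute over-concentrated solution
      reading.waterLevel < sp.waterLevelMin, // or top up the reservoir
  };
}
```

Keeping the decision logic pure (no I/O) makes it easy to unit-test; the backend can then translate each true flag into a message on the backend/pumpCommands topic.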

For disease detection, we regularly captured images of the spinach leaves using the camera module, uploading them through the dashboard interface. The AI model's predictions (disease classifications and confidence scores) were then displayed on the dashboard, allowing us to monitor plant health and intervene early if necessary. While the system was largely automated, we retained manual control options through the dashboard. This provided the flexibility to override automated processes and make specific adjustments as needed. Throughout the operation, the backend server logged all sensor readings, AI predictions, and water consumption data to the Firebase database. We could then easily retrieve this data for analysis and visualization, gaining insights into plant growth patterns, system performance, resource usage, and the effectiveness of our automated control strategies. This data-driven approach is invaluable for continuous system optimization and refinement of our growing strategies.

Code and resource availability

To support reproducibility and open research, all source code and resources developed in this project have been made publicly available:

Method validation

  • AI Model Accuracy: The trained CNN model was evaluated using a separate test dataset, unseen during training or validation. Evaluation was conducted through several means:
    • Quantitative Accuracy: The model achieved ∼81.3 % accuracy on the test set, indicating a strong ability to generalize to new images.
    • Confusion Matrix: A confusion matrix was plotted using Seaborn to visualize true positives, false positives, and false negatives. This helped in identifying misclassifications between similar disease types (e.g., Downy Mildew vs. Anthracnose).
    • Classification Report: Precision, recall, and F1-score were computed for each class. This revealed the model’s ability to correctly identify minority classes and showed a balanced performance across all categories.
    • Visualization: Training vs. validation accuracy/loss graphs were plotted to monitor overfitting and convergence trends across epochs.

This comprehensive evaluation confirmed that the model generalizes reasonably well, with no severe underfitting or overfitting.
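To make the link between the confusion matrix and the per-class metrics explicit, the sketch below derives precision, recall, F1-score, and support from a 3×3 matrix. The matrix values are illustrative only (chosen to roughly echo the recall values in Table 3), not the study's actual confusion matrix:

```javascript
// Compute per-class precision, recall, and F1 from a confusion matrix.
// Convention: rows = true labels, columns = predicted labels.
function classificationReport(matrix) {
  return matrix.map((row, c) => {
    const tp = matrix[c][c];                                  // diagonal entry
    const predicted = matrix.reduce((s, r) => s + r[c], 0);   // column sum
    const actual = row.reduce((s, v) => s + v, 0);            // row sum = support
    const precision = predicted ? tp / predicted : 0;
    const recall = actual ? tp / actual : 0;
    const f1 = precision + recall
      ? (2 * precision * recall) / (precision + recall)
      : 0;
    return { precision, recall, f1, support: actual };
  });
}

// Illustrative matrix for [Healthy-Leaf, Anthracnose, Downy Mildew]:
const report = classificationReport([
  [75, 15, 10],
  [ 6, 79, 15],
  [ 5,  6, 89],
]);
```

With equal support per class, as in Table 3, the macro and weighted averages of these metrics coincide.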

Fig. 5 shows that the model achieves high accuracy across all three classes—Anthracnose, Downy Mildew, and Healthy Leaf—with most predictions concentrated along the diagonal. Minor misclassifications occur between Healthy and Downy Mildew samples, indicating areas for improvement in future retraining.

Fig. 5. Confusion matrix showing true vs. predicted labels.

Fig. 6 shows model accuracy improves steadily over epochs, reaching ∼87% on the training set and ∼74% on the validation set. The loss curves show a consistent decrease in training loss, with validation loss fluctuating after early epochs. This trend suggests the model learns key features effectively but exhibits mild overfitting, leaving scope for improvement with additional data or regularization.

Fig. 6. Training and validation accuracy and loss curves across epochs.

  • System Stability and Responsiveness: A 24-hour stress test demonstrated the stability of both the hardware and software components. Real-time sensor data updates and AI inference results were consistently delivered with minimal latency (under 2 s for image analysis).

  • Automated Control Performance: The automated nutrient dosing and water level maintenance system successfully maintained target pH, TDS, and water levels within the hydroponic system, demonstrating the effectiveness of the closed-loop control logic.

Software and framework versions

The classification report for the test set and the software versions used are provided in Table 3 and Table 4, respectively.

Table 3.

Classification report for the test set.

Class Precision Recall F1-Score Support
Healthy-Leaf 0.87 0.75 0.80 100
Anthracnose 0.79 0.79 0.79 100
Downy Mildew 0.76 0.89 0.82 100
Overall / Avg 0.81 0.81 0.81 300

Table 4.

Software Versions and Libraries.

Component Library/Framework Version
AI Model TensorFlow 2.19.0
API Server FastAPI 0.115.12
Node.js Backend Express 4.19.2
Frontend Dashboard NextJS 14.2
ESP32 C6 Arduino Core Framework 3.2.0

Limitations

While the proposed system demonstrates strong performance in spinach cultivation, several limitations should be acknowledged. First, the CNN model was trained on a relatively small dataset, which may limit robustness under diverse lighting, leaf orientations, or background conditions. Expanding the dataset and applying data augmentation or transfer learning could improve generalization. Second, the system was tested under controlled hydroponic conditions; in real-world deployments, environmental variations such as fluctuating temperature, humidity, or sensor calibration drift may affect accuracy and stability. Third, scalability remains a challenge: while suitable for small-scale setups, deploying this system in larger farms will require optimizing communication throughput, backend processing, and automated nutrient delivery at scale. Future work will address these limitations by incorporating larger and more diverse datasets, experimenting with advanced deep learning architectures, and validating the system across different crop types and environmental conditions. Additionally, integrating edge computing for faster on-site inference and developing adaptive control algorithms for nutrient and water delivery will enhance both scalability and resilience.

  • Processing Constraints: The ESP32 microcontroller has limited processing power and memory, restricting complex on-device analytics. Complex AI tasks are offloaded to an external server.

  • Model Generalization: The initial AI models are trained on spinach datasets and may require retraining for other crops. They are not immediately generalizable.

  • Network Reliability: Intermittent internet connectivity can delay real-time updates and disrupt pump-control commands. The system depends on a stable network connection.

  • Sensor Drift: The accuracy of sensors may degrade over time without periodic recalibration. Long-term accuracy is not guaranteed without maintenance.

Generalizability of the Method

Although validated on spinach, the proposed system is designed to be adaptable to other crops with minimal modifications. For plant disease detection, this primarily involves retraining the CNN model with crop-specific datasets and adjusting preprocessing pipelines for different leaf structures or imaging conditions. On the hardware side, additional sensors may be required depending on the crop’s nutrient or environmental needs (e.g., CO2 sensors for fruiting plants or advanced pH control for sensitive crops). Actuator thresholds such as pump timing, nutrient concentration, or lighting schedules can also be reconfigured in software to suit different growth environments. Furthermore, the modular architecture [21] allows the backend and frontend to remain largely unchanged, enabling straightforward extension of the framework to a wide range of crops and cultivation settings.

Ethics statements

This work did not involve human subjects, animal experiments, or data collected from social media platforms. Therefore, the ethical considerations related to these areas are not applicable.

CRediT authorship contribution statement

Prof. Trupti Baraskar: Conceptualization, Supervision, Project administration, Validation, Writing – review & editing, Resources. Viren Khatri: Methodology, Software, Validation, Visualization, Resources. Parimal Kolhe: Methodology, Software, Validation, Formal analysis, Investigation. Mitheelesh Katyarmal: Methodology, Software, Validation, Formal analysis, Investigation, Data curation, Writing – original draft. Shaunak Khedekar: Methodology, Software, Validation, Resources, Data curation, Writing – original draft.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Footnotes

Direct Submission or Co-Submission: Co-submissions are papers that have been submitted alongside an original research paper accepted for publication by another Elsevier journal

Data availability

Data will be made available on request.

References

  • 1. Rajendran Sasireka, Domalachenpa Tenzing, Arora Himanshu, Li Pai, Sharma Abhishek, Rajauria Gaurav. Hydroponics: exploring innovative sustainable technologies and applications across crop production, with emphasis on potato mini-tuber cultivation. Heliyon. 2024;10(5). doi: 10.1016/j.heliyon.2024.e26823.
  • 2. Sardare Mamta. A review on plant without soil - hydroponics. Int J Res Eng Technol. 2013;02:299–304. doi: 10.15623/ijret.2013.0203013.
  • 3. Rajaseger G., Chan K.L., Yee Tan K., Ramasamy S., Khin M.C., Amaladoss A., Kadamb Haribhai P. Hydroponics: current trends in sustainable crop production. Bioinformation. 2023;19(9):925–938. doi: 10.6026/97320630019925.
  • 4. Khare P., Bhat R.S., Kulkarni P.S. IoT-based AI controller and mobile app for solar-smart hydroponics. Proc. 2nd Int. Conf. Emerg. Trends Eng. (ICETE). 2023:123–130.
  • 5. Kondaka L.S., Iyer R., Jaiswal S., Ali A. A smart hydroponic farming system using machine learning. 2023 International Conference on Intelligent and Innovative Technologies in Computing, Electrical and Electronics (IITCEE), Bengaluru, India. 2023:357–362. doi: 10.1109/IITCEE57236.2023.10090860.
  • 6. Rahman M.A., Chakarborty N.R., Sufiun A., Banshal S.K., Tajnin F.R. An AIoT-based hydroponic system for crop recommendation and nutrient parameter monitorization. Smart Agric. Technol. 2024;8:100472. doi: 10.1016/j.atech.2024.100472.
  • 7. Mehra M., Saxena S., Sankaranarayanan S., Tom R.J., Veeramanikandan M. IoT based hydroponics system using Deep Neural Networks. Comput. Electron. Agric. 2018;155:473–486.
  • 8. Aurasopon Apinan, Thongleam Thawatchai, Kuankid Sanya. Integration of IoT technology in hydroponic systems for enhanced efficiency and productivity in small-scale farming. Acta Technologica Agriculturae. 2024;27:203–211. doi: 10.2478/ata-2024-0027.
  • 9. Abdullah M.S.T., Mazalan L. Smart automation aquaponics monitoring system. JOIV: Int. J. Informatics Vis. 2022;6(1–2):45–50. doi: 10.30630/joiv.6.1-2.925.
  • 10. Akter T., Mahmud T., Chakma R., Datta N., Hossain M.S., Andersson K. Smart monitoring and control of hydroponic systems using IoT solutions. 2024 International Conference on Inventive Computing and Informatics (ICICI), Bangalore, India. 2024:761–767. doi: 10.1109/ICICI62254.2024.00128.
  • 11. Saraswathy V.R., Nithiesh C., Palani Kumaravel S., Ruphasri S. Integrating intelligence in hydroponic farms. International Journal of Electrical Engineering and Technology. 2020;11(4):150–158. doi: 10.34218/IJEET.11.4.2020.017.
  • 12. Prasad R., B. K.S., P. P., S. R., Sudharshana K R. AI based smart hydroponics system. 2024 IEEE International Conference on Information Technology, Electronics and Intelligent Communication Systems (ICITEICS), Bangalore, India. 2024:1–5. doi: 10.1109/ICITEICS61368.2024.10625374.
  • 13. Rao P.B.S., Venkatesh R.T., Murthy S.G., Negavadi S.B., Srinivasulu T., Nataraj Y.B., Satish V., Balenahalli V.K. Automated IoT solutions for efficient hydroponic farming: nutrients, pH and lighting management. Journal Européen des Systèmes Automatisés. 2024;57(5):1273–1283. doi: 10.18280/jesa.570503.
  • 14. Gupta M.L., Roy P.K. Machine learning-based crop growth management in greenhouse environment using hydroponics farming techniques. J. Agric. Food Res. 2023;9. doi: 10.1016/j.jafr.2023.100001.
  • 15. Tatas Konstantinos, Al-Zoubi Ahmad, Christofides Nicholas, Zannettis Chrysostomos, Chrysostomou Michael, Panteli Stavros, Antoniou Anthony. Reliable IoT-based monitoring and control of hydroponic systems. Technologies. 2022;10(1):26. doi: 10.3390/technologies10010026.
  • 16. Idoje G., Mouroutoglou C., Dagiuklas T., Kotsiras A., Muddesar I., Alefragkis P. Comparative analysis of data using machine learning algorithms: a hydroponics system use case. Smart Agric. Technol. 2023;4:100207. doi: 10.1016/j.atech.2023.100207.
  • 17. Ferentinos K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018;145:311–318. doi: 10.1016/j.compag.2018.01.009.
  • 18. Younis M.F., Alwan Z.S. Monitoring the performance of cloud real-time databases: a Firebase case study. 2023 Al-Sadiq International Conference on Communication and Information Technology (AICCIT), Al-Muthana, Iraq. 2023:240–245. doi: 10.1109/AICCIT57614.2023.10217953.
  • 19. Bayılmış C., Ebleme M.A., Çavuşoğlu Ü., Küçük K., Sevin A. A survey on communication protocols and performance evaluations for Internet of Things. Digit. Commun. Netw. 2022;8(6):1094–1104. doi: 10.1016/j.dcan.2022.03.013.
  • 20. Lakshmanan S., et al. Automated smart hydroponics system using internet of things. Int. J. Electr. Comput. Eng. (IJECE). 2020;10(6):6389–6398. doi: 10.11591/ijece.v10i6.pp6389-6398.
  • 21. Aranda M., et al. Modular IoT-based automated hydroponic system. 2021 International Conference on Mechatronics, Electronics and Automotive Engineering (ICMEAE), Cuernavaca, Mexico. 2021:220–226. doi: 10.1109/ICMEAE55138.2021.00042.
