

Hindawi Publishing Corporation, ISRN Artificial Intelligence, Volume 2013, Article ID 430986, 9 pages. http://dx.doi.org/10.1155/2013/430986

Research Article: Yield Prediction for Tomato Greenhouse Using EFuNN

Kefaya Qaddoum, E. L. Hines, and D. D. Iliescu

School of Engineering, University of Warwick, Coventry CV4 7AL, UK

Correspondence should be addressed to E. L. Hines; e.l.hines@warwick.ac.uk

Received 9 January 2013; Accepted 28 January 2013

Academic Editors: C. Kotropoulos, H. Ling, and L. S. Wang

Copyright © 2013 Kefaya Qaddoum et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

In the area of greenhouse operation, yield prediction still relies heavily on human expertise. This paper proposes an automatic tomato yield predictor to assist the human operators in anticipating weekly fluctuations more effectively and in avoiding the problems of both overdemand and overproduction that arise when the yield cannot be predicted accurately. The parameters used by the predictor consist of environmental variables inside the greenhouse, namely, temperature, CO2, vapour pressure deficit (VPD), and radiation, as well as past yield. Greenhouse environment data and crop records from a large-scale commercial operation, Wight Salads Group (WSG) in the Isle of Wight, United Kingdom, collected during the period 2004 to 2008, were used to model tomato yield using an Intelligent System called "Evolving Fuzzy Neural Network" (EFuNN). Our results show that the EFuNN model predicted weekly fluctuations of the yield with an average accuracy of 90%. The contribution suggests that multiple EFuNNs can be mapped to respective task-oriented rule sets, giving rise to adaptive knowledge bases that could assist growers in the control of tomato supplies and, more generally, could inform decision making concerning overall crop management practices.

1. Introduction

Greenhouse production systems require implementing computer-based climate control systems, including carbon dioxide (CO2) supplementation. The sort of systems we are concerned with here are normally in use all year round so as to maximize production and are thus typically applied in scenarios where the greenhouse crops have a long growing cycle. The technological advances and the sophistication of greenhouse crop production control systems do not mean that greenhouse operation does not rely on human expertise to decide on the optimum values for the weekly yield amount. Practiced greenhouse tomato growers and researchers evaluate plant responses and growth mode by observations of the plant morphology. Tomato growers use this information in decision making, depending on climate conditions and crop management practices, to shift the plant growth toward a "balanced" growth mode, or to be able to accurately predict regular crop amounts each year.

Tomato crop growth is a dynamic and complex system, and few models have studied it previously. Two of the dynamic growth models are TOMGRO [1, 2] and TOMSIM [3, 4]. Both models depend on physiological processes, and they model biomass partitioning, crop growth, and yield as a function of several climate and physiological parameters. Their use is limited, especially for practical application by growers, by their complexity and by the difficulty in obtaining the initial condition parameters required for implementation [3]. Moreover, critical measurements are required for calibration and validation for each application.

The Tompousse model [5] predicts yields in terms of the weight of harvested fruits. The model was developed in France for heated greenhouses and required that the linear relationships of both flowering rate and fruit growth period were obtained in an appropriately warm environment; when the system was implemented in unheated plastic greenhouses, in Portugal, for example, the model performed poorly and was only tested for short production cycles of less than 15 weeks.

Adams [6] proposed a greenhouse tomato model and implemented it in the form of a graphical simulation tool (HIPPO). A key objective of the model was to explain the weekly fluctuations of greenhouse tomato yields as characterized by fruit size and harvest rates. This model required hourly climate data in order to determine the rates of growth of leaf truss and flower production.


Although the seasonal fluctuations of yield in greenhouse crops are generally understood to be influenced by the periodic variation of solar radiation and air temperature, greenhouse growers are also interested in the short- and long-term fluctuations of yield. There are a number of useful tools that can help growers when they are making short- and long-term decisions. For example, there are crop models that predict yield rates and produce quality, which help in defining climate control strategies, in synchronizing crop production with market demands, in handling the labour force, in developing marketing strategies, and in maintaining a consistent year-round produce quality.

As we will show, EFuNN offers the advantage that it is able to model nonlinear system relationships and has been shown in other applications to be very robust when applied to data which is relatively imprecise, incomplete, and uncertain. EFuNN has been successfully applied in applications such as forecasting, control, optimization, and pattern recognition [7]. Intelligence is added to the process by computing the degree of uncertainty and computing with linguistic terms (fuzzy variables). More accuracy is obtained compared to mechanistic models.

Numerous studies have applied either neural networks (NNs) or fuzzy logic in greenhouse production systems. However, most of them have focused on modeling the air temperature in greenhouse environments [8-11] or optimal control of CO2 with NN. Recent techniques have included modeling the greenhouse environment with hierarchical fuzzy modeling [12] or controlling the environment with optimized fuzzy control [13, 14]. Other studies concerning plant modeling have also been reported: [15] implemented a hybrid neurofuzzy approach for the system identification and modeling of the total dry weight yield of tomato and lettuce, and [16] developed a fuzzy model to predict net photosynthesis of tomato crop canopies; the results obtained correlated well with those obtained when TOMGRO [1] was used to model the prediction processes.

The objective of this study is to investigate how an IS technique such as EFuNN performs when applied to current crop and climate records from greenhouse growers for the weekly prediction of greenhouse tomato yield from environmental and crop-related variables. Yield was characterized by yield per unit area (Yield, kg/m2).

The rest of this paper is organized as follows. Section 2 gives background on rule extraction, Section 3 describes the materials and methods, Section 4 presents the experimental setup and results, and Section 5 presents the conclusions.

2. Background to Rule Extraction

A wide variety of methods are now available, recently reviewed in [1, 17, 18]. Reference [17] revisits the Andrews classification of rule extraction methods and emphasizes the distinction between decompositional and pedagogical approaches. Rule extraction methods usually start by finding a minimal network in terms of the number of hidden units and overall connectivity. The next simplification, the key feature of the method, is to cluster the hidden unit activations and then extract the combinations of inputs which will activate each hidden unit, singly or together, and thus the output; this generates rules of the general form shown in (1):

If $(x_1 \le t_1)$ AND $\cdots$ AND $(x_p \ge t_p)$ Then $C_i$.  (1)

Taha and Ghosh [19] suggest binary inputs, generating a truth table from the inputs and simplifying the resultant Boolean function. The growth of computational time with the number of attributes makes minimizing the size of the neural network essential, and some methods evolve minimal topologies. The pedagogical approaches treat the neural network as a black box [20] and use the neural network only to generate test data for the rule generation algorithm.
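As a rough illustration of this pedagogical, truth-table style of extraction, the sketch below queries a black-box classifier over all binary input combinations and groups the inputs by the class they activate. The toy predictor and class labels are hypothetical stand-ins for a trained network, and Boolean simplification of each group (e.g. Quine-McCluskey) would still be needed to obtain compact If-Then rules.

```python
from itertools import product

def extract_rules_pedagogical(predict, n_inputs):
    """Query a black-box classifier over every binary input combination and
    group the input patterns (minterms) by the class they activate."""
    truth_table = {}
    for bits in product([0, 1], repeat=n_inputs):
        truth_table.setdefault(predict(bits), []).append(bits)
    return truth_table

# Hypothetical stand-in for a trained network's decision function.
def toy_predictor(x):
    return "C1" if x[0] == 1 and x[2] == 0 else "C2"

for cls, minterms in extract_rules_pedagogical(toy_predictor, n_inputs=3).items():
    print(cls, minterms)
```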

2.1. Taxonomy of Rule Extraction Algorithms. It is now becoming apparent that algorithms can be designed which extract understandable representations from trained neural networks, enabling them to be used for decision making. In this section we use a taxonomy presented in [21], which uses three criteria for the classification of rule extraction algorithms: scope of use, type of dependency on the "black box" method of solution, and format of the extracted rules. The algorithms can be regression or classification algorithms; some algorithms can be applied to both cases, such as G-REX [21]. On the second criterion, an algorithm is considered independent if it is totally independent of the type of black-box model used (such as ANN and Support Vector Machines); the algorithms that use information from the black-box methods are called dependent methods. Regarding the format of the extracted rules, the methods can be classified into descriptive and predictive. The predictive algorithms perform extraction of rules that allow the expert to make an easy prediction for each possible observation from the input space; if this analysis cannot be made directly, the algorithms are known only as descriptive.

3. Materials and Methods

3.1. EFuNN: Evolving Fuzzy Neural Networks and the EFuNN Algorithm

3.1.1. Fuzzy Background. Fuzzy inference systems (FISs) are very useful for inference and handling uncertainty; the basic models are presented in [14]. Some important issues that must be considered when building an FIS are identification of structure and estimation of parameters. Efficient structure identification optimizes the number of fuzzy rules and yields better convergence [22]. Different membership functions (MFs) can be attached to the neurons (triangular, Gaussian, etc.). The number of rules and the membership functions were estimated by the designers in early implementations of FIS [14]. A more efficient structure is then employed to optimize the number of rules in adaptive techniques, which are appropriate for learning parameters that change slowly; handling complex systems with rapidly changing characteristics is harder, considering the fact that it takes a long time after every important change in the system to relearn the model parameters [23, 24].
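As a minimal sketch of the triangular and Gaussian membership functions mentioned above (the breakpoints and the example temperature value are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def triangular_mf(x, a, b, c):
    """Triangular membership rising from a to a peak at b and falling to c."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def gaussian_mf(x, centre, sigma):
    """Gaussian membership centred on `centre` with spread `sigma`."""
    return float(np.exp(-0.5 * ((x - centre) / sigma) ** 2))

# Fuzzifying a 22 degC greenhouse temperature against a "medium" label.
print(triangular_mf(22.0, a=15.0, b=20.0, c=25.0))   # 0.6
print(gaussian_mf(22.0, centre=20.0, sigma=3.0))     # about 0.8
```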

Figure 1: FuNN architecture. The figure shows the five layers: a real-values input layer (with a bias node), a condition elements layer (S_c, a_c, O_c), a rule layer (S_R, a_R, O_R), an action elements layer (S_A, a_A, O_A), and a real-values output layer (S_o, a_o, O_o), with DI connections between the condition and rule layers and CF connections between the rule and action layers.

References [14, 25] describe some techniques to learn fuzzy structure and parameters. In recent years several evolving neurofuzzy systems (ENFSs) have been simulated, and these systems use online learning algorithms that can extract knowledge from data and perform a high-level adaptation of the network structure as well as learning of parameters.

3.1.2. The General Fuzzy Neural Network (FuNN) Architecture. The fuzzy neural network (FuNN) (Figure 1) is a connectionist feed-forward architecture with five layers of neurons and four layers of connections [26]. The first layer of neurons receives input information. The second layer calculates the fuzzy membership degrees of the input values, which belong to predefined fuzzy membership functions, for example, small, medium, and large. The third layer of neurons represents associations between the input and the output variables, the fuzzy If-Then rules. The fourth layer calculates the degrees to which output membership functions are matched by the input data, and the fifth layer does defuzzification and calculates values for the output variables. A FuNN has both the features of a neural network and a fuzzy inference machine. Several training algorithms have been developed for FuNN [26]: a modified back-propagation algorithm, a genetic algorithm, structural learning with forgetting, training and zeroing, and combined modes. Several algorithms for rule extraction from FuNN have also been developed and applied; each of them represents each rule node of a trained FuNN as an If-Then fuzzy rule.

3.2. EFuNN Architecture. EFuNNs are FuNN structures that evolve according to the Evolving Connectionist Systems (ECOS) principles. That is, all nodes in an EFuNN are generated through learning. The nodes representing membership functions can be modified during learning. As in FuNN, each input variable is represented here by a group of spatially arranged neurons standing for different fuzzy domain areas of this variable. Different membership functions can be attached to these neurons (triangular, Gaussian, etc.). New neurons evolve in this layer if, for a given input vector, the corresponding variable value does not belong to any of the existing membership functions to a membership degree greater than a membership threshold; this means that a new fuzzy label neuron or an input variable neuron can be created during the adaptation phase of an EFuNN.
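A rough sketch of that evolution rule for a single input variable follows; the Gaussian width, the membership threshold, and the example label centres are assumed values used only for illustration.

```python
import numpy as np

def update_fuzzy_labels(value, centres, sigma=1.0, m_thr=0.5):
    """For one input variable, check whether `value` belongs to any existing
    fuzzy label (Gaussian MF centres) to a degree above the membership
    threshold; if not, evolve a new label centred on the value."""
    centres = list(centres)
    degrees = np.exp(-0.5 * ((value - np.array(centres)) / sigma) ** 2) if centres else np.array([])
    if degrees.size == 0 or degrees.max() < m_thr:
        centres.append(value)          # a new fuzzy label neuron is created
    return centres

labels = [16.0, 20.0, 24.0]                 # e.g. low / medium / high temperature
print(update_fuzzy_labels(29.5, labels))    # far from all labels -> a new one appears
print(update_fuzzy_labels(20.3, labels))    # close to 20.0 -> unchanged
```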

3.3. Yield Records and Climate Data. Crop records and climate data from a tomato greenhouse operation of WSG (Wight Salads Group) in the Isle of Wight, United Kingdom, were used to design and train evolving fuzzy neural networks for yield prediction. We used a fuzzy inference system for implementing input parameter characterization. The data include six datasets from two production cycles (S1: 2004 to 2007 and S2: 2008) and one greenhouse section (New Site). The total number of records is 1286, and each record included 14 parameters characterizing the weekly greenhouse environment and the crop features.

The environmental parameters which are to be controlled are the vapor pressure deficit (VPD) and the differential temperature between daytime and nighttime ($T_d - T_n$). The setpoints for each environmental treatment were VPD = 2.0 kPa, 0.6 kPa, and uncontrolled, and $T_d/T_n$ = 26°C/18°C, 20°C/20°C, and 22°C/18°C, respectively, for each compartment. The dates when the crops were planted and the period over which they were allowed to grow, in the case of both datasets, are summarized in Table 1.

The tomatoes were grown in greenhouses on a high-wire system with hydroponic CO2 supplementation and computer climate/irrigation control. The greenhouses were equipped with hot-water heating pipes and roof vents for passive cooling.

The crop records consisted of 12 plant samples per greenhouse section that were randomly selected and continuously measured during the production cycle. The crop record data were collected by direct observation and by manually measuring or counting each of the morphological features. This system included an electronic sensor unit which measured the air temperature, humidity, and CO2 concentration in each of the greenhouse sections. Outside weather conditions were determined via a weather station. Daytime, nighttime, and 24 h averages of daily climate data from outside and inside of the greenhouse section were also obtained by the grower; weekly averages were computed to match the weekly crop records.

3.4. Modeling the Parameters. Yield ($Y_a$, kg m$^{-2}$ week$^{-1}$) is of interest to greenhouse growers as a means via which they can develop short-term crop management strategies; it is also useful for labour management strategies. Greenhouse tomato cumulative yield can be described either as fresh weight (Cockshull, 1988) or as dry mass [27]. Both of these studies were performed in northern latitudes without CO2 enrichment, for short production periods (<100 days), and the plants were cultivated in soil, not hydroponically. These are not currently standard cultivation methods, because most of the greenhouse growers make use of high-technology production facilities.

Yield development is influenced mainly by fruit temperature. This parameter is inversely related to fruit development rate and shows a linear relationship with air temperature [28, 29]. This relationship is shown in Figure 3 for all the datasets, which show the relation between yield and the air temperature ($T_{in24}$).


Table 1: Summary of datasets (commercial DS). Crop records include samples per week in New House sites. WOY: week of the year number.

Season (years)   Greenhouse   Transplanted on, (WOY) and date   Crop duration (weeks)   Cultivar
1 (2004-2005)    New h21      (26) June 24, 2004                42                      Campari
                 New h22      (28) July 8, 2004                 47                      Campari
2 (2004-2007)    New h21      (31) July 28, 2005                43                      Campari
                 New h22      (25) June 16, 2005                44                      Campari
                 New h22      (33) Aug 11, 2006                 36                      Cherry
                 New h22      (41) Oct 6, 2007                  39                      Cherry

for each evolving layer neuron h do
    Create a new rule r
    for each input neuron i do
        Find the condition neuron c with the largest weight W_ch
        Add an antecedent to r of the form "i is c (W_ch)", where W_ch is the confidence factor for that antecedent
    end for
    for each output neuron o do
        Find the action neuron a with the largest weight W_ha
        Add a consequent to r of the form "o is a (W_ha)", where W_ha is the confidence factor for that consequent
    end for
end for

Algorithm 1: EFuNN algorithm steps.


3.4.1. Data Processing. The preprocessing steps included averaging over a certain number of records around certain points of the data. We preprocessed 5 variables (CO2, temperature, vapor pressure deficit (VPD), yield, and radiation) for different tomato cultivars from different greenhouses in the WSG area; these data were not ready for processing and had to be preprocessed. For instance, some values in some tomato records were missing, and we had to replace them with 0, since the records were not yet fit to be fed to an artificial neural network for processing. Thus, three preprocessing steps were taken (a minimal sketch of these steps is given after the list):

(1) Edit each data file and group the same tomato cultivar and type in one file, with all environmental variables of that cultivar gathered from different greenhouses.

(2) For each cultivar, store the values in a Microsoft Excel spreadsheet. Next, in the spreadsheet, 0 replaced null values; some missing VPD and radiation values were averaged.

(3) The Excel spreadsheet content was converted to a .dat file to be fed into Matlab and the EFuNN application.
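A rough equivalent of steps (1)-(3) in code is sketched below; the file names and column names are assumptions used only for illustration, and the original workflow used Excel and Matlab rather than Python.

```python
import pandas as pd

# Hypothetical per-greenhouse weekly record files for one cultivar.
files = ["new_h21_campari.csv", "new_h22_campari.csv"]

# (1) Group records of the same cultivar/type from different greenhouses.
data = pd.concat([pd.read_csv(f) for f in files], ignore_index=True)

# (2) Average the missing VPD and radiation values; replace remaining nulls with 0.
for col in ["vpd", "radiation"]:
    data[col] = data[col].fillna(data[col].mean())
data = data.fillna(0)

# (3) Export to a plain-text file that the EFuNN application can read.
data.to_csv("tomato_efunn_input.dat", sep="\t", index=False)
```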

3.5. Neural Network Model. Computational NNs have proven to be a powerful tool for solving several types of problems in various real-life fields where approximation of nonlinear functions, classification, identification, and pattern recognition are required. NNs are mathematical representations of biological neurons in the way that they process information as parallel computing units. In general, there are two types of neural network architecture: (1) static (feedforward) networks, in which no feedback or time delays exist, and (2) dynamic neural networks, whose outputs depend on the current or previous inputs, outputs, or states of the network (Demuth et al. [30]).

We consider a neural network that consists of an input layer with $n+1$ nodes, a hidden layer with $h$ units, and an output layer with $l$ units, as follows:

$$y_i = g\Bigl(\sum_{j=1}^{h} w_{ij}\, f\Bigl(\sum_{k=1}^{n+1} v_{jk} x_k\Bigr)\Bigr), \quad i = 1, 2, \ldots, l, \qquad (2)$$

where $x_k$ indicates the $k$th input value, $y_i$ the $i$th output value, $v_{jk}$ a weight connecting the $k$th input node with the $j$th hidden unit, and $w_{ij}$ a weight between the $j$th hidden unit and the $i$th output unit. We start learning with one hidden layer, updating the weights and thresholds to minimize the objective function by any optimization algorithm, and terminate the network training with this number of hidden units when a local minimum of the objective function has been reached. If the desired accuracy is not reached, the number of hidden units is increased, with random weights and thresholds, and the updating is repeated; otherwise training is finished and stops.

Figure 2: Simple EFuNN structure (inputs, fuzzification, rule nodes, defuzzification, outputs).
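A minimal numpy sketch of the forward pass in (2) follows; the activation choices (tanh for $f$, identity for $g$), the layer sizes, and the random weights are assumptions for illustration only.

```python
import numpy as np

def forward(x, V, W, f=np.tanh, g=lambda z: z):
    """Forward pass of equation (2): y_i = g(sum_j w_ij * f(sum_k v_jk * x_k))."""
    return g(W @ f(V @ x))          # V: (h, n+1), W: (l, h), x: length n+1

rng = np.random.default_rng(0)
n, h, l = 4, 6, 1                        # 4 inputs plus a bias node, 6 hidden units, 1 output
x = np.append(rng.random(n), 1.0)        # bias term appended as the (n+1)th input
V = rng.normal(size=(h, n + 1))
W = rng.normal(size=(l, h))
print(forward(x, V, W))
```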

3.6. The EFuNN Algorithm. EFuNN consists of a five-layer structure which begins with the input layer representing the input variables; each input variable is represented by a group of arranged neurons forming a fuzzy quantization of this variable, which is then presented to the second layer of nodes. Different membership functions (MFs) can be attached to these neurons (triangular, Gaussian, etc.), and continuing modifications of the nodes and their membership functions can be applied during learning.

Through node creation and consecutive aggregation, an EFuNN system can adjust over time to changes in the data stream and at the same time preserve its generalization capabilities. If EFuNNs use linear equations to calculate the activation of the rule nodes (instead of using Gaussian functions and exponential functions), the EFuNN learning procedure is faster. EFuNN also produces a better online generalization, which is a result of more accurate node allocation during the learning process. As Algorithm 1 shows, EFuNN allows for continuous training on new data, further testing, and also training of the EFuNN on the test data in an online mode, leading to a significant improvement of the accuracy.

The third layer contains rule nodes that evolve through hybrid supervised learning. These rule nodes represent prototypes of input-output data associations, where each rule node is defined by two vectors of connection weights, $W1(r)$ and $W2(r)$; $W2$ is adjusted through learning depending on the output error, and $W1$ is modified based on a similarity measure within the local area of the input space.

The fourth layer of neurons represents fuzzy quantification of the output variables, similar to the input fuzzy neuron representation. The fifth layer represents the real values for the output variables. In the case of a "one-of-n" mode, EFuNN transmits the maximum activation of the rule to the next level. In the case of a "many-of-n" mode, the activation values of m (m > 1) rule nodes that are above an activation threshold are transmitted further in the connectionist structure.

In this paper the evolving algorithm (the EFuNN structure is shown in Figure 2) is based on the principle that rule nodes only exist if they are needed. As each training example is presented, the activation values of the nodes in the rule and action layers and the error over the action nodes are examined: if the maximum rule node activation is below a set threshold, then a rule node is added; if the action node error is above a threshold value, a rule node is added; finally, if the radius of the updated node is larger than a radius threshold, then the updating process is undone and a rule node is added.
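The three checks just described can be summarised in a small sketch; the threshold values here are illustrative, not those used in the paper.

```python
def needs_new_rule_node(rule_activations, action_error, node_radius,
                        act_thr=0.5, err_thr=0.1, rad_thr=1.0):
    """Decide whether a new rule node is needed for the current training example."""
    if max(rule_activations, default=0.0) < act_thr:
        return True        # no existing rule node is activated strongly enough
    if action_error > err_thr:
        return True        # the winning node's output error is too large
    if node_radius > rad_thr:
        return True        # updating would stretch the node beyond its radius limit
    return False

print(needs_new_rule_node([0.2, 0.4], action_error=0.05, node_radius=0.3))   # True
print(needs_new_rule_node([0.9, 0.4], action_error=0.05, node_radius=0.3))   # False
```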

EFuNN has several parameters to be optimized, and they are

(1) number of inputs and outputs,
(2) learning rates for W1 and W2,
(3) pruning control,
(4) aggregation control,
(5) number of membership functions,
(6) shape of membership functions,
(7) initial sensitivity threshold,
(8) maximum radius,
(9) m-of-n value.

3.7. Input/Output Parameters. The architecture and training mode of EFuNN depend on the input and output parameters, as determined by the problem being solved. The tomato crop data include biological entities that show nonlinear dynamic behaviour and whose response depends not only on several environmental factors but also on the current and previous crop conditions. Greenhouse tomatoes have long production cycles, and the total yield is determined by these parameters. The effects of several environmental factors (light, CO2, air humidity, and air temperature) have both short- and long-term impacts on tomato plants. Air temperature directly affects fruit growth and, according to [28], is the most influential parameter on the growth process (Figure 3).

Greenhouse tomatoes have a variable fruit growth period, ranging from 40 to 67 days. This deviation results from the changing 24-hour average air temperature in the greenhouse. Here the objective was to design an EFuNN that is simple enough and accurate enough to predict the variables of interest, since for many practical problems variables have different levels of importance and make different contributions to the output. Also, it is necessary to find an optimal normalization and assign proper importance factors to the variables, reduce the size of the input vectors, and keep only the most important variables. This dynamic network architecture was chosen because of its memory association and learning capability with sequential and time-varying patterns, which is most likely the biological situation for tomato plants.

The normalized fuzzy distance used by EFuNN to compare a fuzzified input vector $I$ with a rule node's input connection weights $W$ is

$$D_n = \frac{(1/2) \sum_{i=1}^{c} |I_i - W_{in}|}{\sum_{i=1}^{c} W_{in}}. \qquad (3)$$
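A one-line implementation of (3) is sketched below; the example vectors are made-up membership degrees, purely for illustration.

```python
import numpy as np

def normalized_fuzzy_distance(I, W):
    """Equation (3): D_n = (1/2) * sum_i |I_i - W_i| / sum_i W_i, comparing a
    fuzzified input vector I with a rule node's input connection weights W."""
    I, W = np.asarray(I, dtype=float), np.asarray(W, dtype=float)
    return 0.5 * np.abs(I - W).sum() / W.sum()

print(normalized_fuzzy_distance([0.2, 0.8, 0.0], [0.1, 0.9, 0.2]))   # about 0.17
```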

Figure 3: 24 h average air temperature relationship with yield in the tomato greenhouse (yield in kg plotted against temperature).

4. Experiment Setup, Training, and Performance Evaluation

The training process implies iterative adjustment of the weights and biases of the network by minimizing a performance function when presenting input and target vectors to the network. The mean square error (MSE), selected as the performance function (Table 2), was calculated from the difference between the target output and the network output.
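For reference, the standard form of this criterion is given below (with $N$ training examples, target $t_n$, and network output $y_n$; the notation is assumed here, and the RMSE reported in Tables 2 and 3 is its square root).

```latex
\mathrm{MSE} = \frac{1}{N}\sum_{n=1}^{N} \left( t_n - y_n \right)^2,
\qquad
\mathrm{RMSE} = \sqrt{\mathrm{MSE}}
```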

The supervised learning in EFuNN is built on the previously explained principles, so that when a new data example is presented, the EFuNN creates a new rule node to memorize the two input and output fuzzy vectors and/or adjusts the winning rule node. After a certain number of examples are applied, some neurons and connections may be pruned or aggregated, and only the best are chosen. Different pruning rules can be applied in order to realize the most successful pruning of unnecessary nodes and connections. Although there are several options for growing the EFuNN, we restricted the learning algorithm to the 1-of-n algorithm.

Using Gaussian functions and exponential functions, the EFuNN learning procedure produces a better online generalization, which is a result of more accurate node allocation during the learning process. EFuNN allowed continuous training on new data; we did further testing and also training of the EFuNN on the test data in an online mode, which led to a significant improvement in accuracy. A significant advantage of EFuNNs is the local learning, which allowed fast adaptive learning where only a few connections and nodes are changed or created after the entry of each new data item. This is in contrast to the global learning algorithms, where all connection weights change for each input vector.

Tomato plants were included in the training, testing, and validation process. The initial datasets were randomly divided so that 60% (771) of the records were assigned to the training set, 20% (257) to the validation set, and 20% (257) to the test set. The generalization in each of the networks was improved by implementing the early stopping method through the validation set. After training each of the networks to a satisfactory performance, the independent validation set was used to evaluate the prediction performance and to present the results.

Table 2: Test results and performance comparison of demand forecasting.

                                        EFuNN     ANN (Multilayer perceptron)
Learning epochs                         1         2500
Training error (RMSE)                   0.0013    0.116
Testing error (RMSE)                    0.0092    0.118
Computational load (in billion flops)   0.536     872

Table 3: Results from 4 runs of the EFuNN.

RMS (training)   No. of rule nodes   Accuracy (%)
0.0045           77                  82.95
0.0023           48                  86.00
0.0029           75                  84.91
0.0028           81                  80.79

We initialized $x(t), x(t-1), x(t-2), \ldots, x(t-n)$ in order to predict $x(t+1)$, and the training was replicated three times using three different samples of training data and different combinations of network parameters. We used four Gaussian MFs for each input variable as well as the following evolving parameters: number of membership functions = 3, sensitivity threshold $S_{thr}$ = 0.99, learning rate = 0.25, error threshold $Err_{thr}$ = 0.001, and learning rates for the first and second layers = 0.05. EFuNN uses a one-pass training approach. The network parameters were determined using a trial and error approach. The training was repeated three times after reinitializing the network, and the worst errors are reported in Figure 5. Online learning in EFuNN resulted in creating 2122 rule nodes, as depicted in Figure 6. We illustrate the EFuNN training results, and the training performance is summarized in Table 3.
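An illustrative parameter set mirroring the values reported above is sketched below; the key names are assumptions and do not follow any particular EFuNN implementation's API.

```python
efunn_params = {
    "n_membership_functions": 3,
    "mf_shape": "gaussian",
    "sensitivity_threshold": 0.99,          # Sthr
    "error_threshold": 0.001,               # Errthr
    "learning_rate": 0.25,
    "layer_learning_rates": (0.05, 0.05),   # first and second connection layers
    "training_epochs": 1,                   # one-pass (single-epoch) training
}
```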

An investigation of the extracted rule set from run 2 (shown in Figure 4) indicates that different compartments of the mentioned environmental variables affected the yield prediction. Rules can be obtained from three different kinds of sources: human experts, numerical data, and neural networks. All the obtained rules can be used in a rule selection method to obtain a smaller linguistic rule-based system with a higher performance. We explain the fuzzy arithmetic-based approach [31] to linguistic rule extraction from a trained EFuNN for modeling problems using some computer simulations. Assume that our training output was five rules for the nonlinear function realized by the trained neural network; we assume that five linguistic terms are given for each of the two input variables, and that the same five terms are given for the output variable.


Figure 4: Some rules extracted from the EFuNN simulation.

The number of combinations of terms is 25. Each combination is presented to the trained neural network as a linguistic input vector $A_q = (A_{q1}, A_{q2})$. The corresponding fuzzy output $O_q$ is calculated by fuzzy arithmetic. This calculation is numerically performed for the h-level sets of $A_q$ for $h = 0.1, 0.2, \ldots, 1.0$. The fuzzy output $O_q$ is compared with each of the five linguistic terms. The linguistic term with the minimum difference from the fuzzy output $O_q$ is chosen as the consequent part $B_q$ of the linguistic rule $R_q$ with the antecedent part $A_q$. For example, let us consider the following linguistic rule:

Rule $R_q$: If $X_1$ is medium and $X_2$ is small, Then $y$ is $B_q$.

To determine the consequent part $B_q$, the antecedent part of the linguistic rule $R_q$ is presented to the trained neural network as the linguistic input vector, and the corresponding fuzzy output $O_q$ is calculated.
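A heavily simplified sketch of this consequent-selection step follows. Linguistic terms are represented here by assumed crisp centres and the trained network is replaced by a toy function; a full treatment would propagate the h-level sets of the fuzzy inputs through the trained EFuNN instead.

```python
from itertools import product

terms = {"very small": 0.0, "small": 0.25, "medium": 0.5, "large": 0.75, "very large": 1.0}

def network(x1, x2):
    return 0.6 * x1 + 0.4 * x2       # stand-in for the trained predictor

rules = []
for (n1, c1), (n2, c2) in product(terms.items(), repeat=2):     # 25 combinations
    out = network(c1, c2)
    consequent = min(terms, key=lambda t: abs(terms[t] - out))   # closest linguistic term
    rules.append(f"If X1 is {n1} and X2 is {n2} then y is {consequent}")

print(len(rules))        # 25
print(rules[12])         # "If X1 is medium and X2 is medium then y is medium"
```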

As a result, the rule sets shown in Table 3 will be shrunk into the range of 5-10 rules, which makes them more efficient for operators to work with.

The results obtained from this study indicated that the EFuNN had successfully learnt the input variables and was then able to use these variables as a means of identifying the estimated yield amount.

From the results obtained from testing the EFuNN and its relative performance against the Multilayer Perceptron, it is evident that the EFuNN models this task far better than any of the other models. In addition, the rules extracted from the EFuNN reflect the nature of the training data set and confirm the original hypothesis that more rules would need to be evolved to get better predictions. The poor performance of the Multilayer Perceptron can be explained by examining the nature of this connectionist model. Having a fixed number of hidden nodes limits the number of hyperplanes these models can use to separate the complex feature space, and selection of the optimal number of hidden nodes becomes a case of trial and error. If there are too many, then the ability of the Multilayer Perceptron to generalize to new instances of records is reduced, due to the possibility of overlearning the training examples. This work has sought to describe a problematic area within horticulture produce management, that of predicting yields in greenhouses. The EFuNN has clearly stood out as an appropriate mechanism for this problem and has performed comparably well against other methods. But the most beneficial aspect of the EFuNN architecture is the rule extraction ability: by extracting the rules from the EFuNN we can analyse more clearly why the EFuNN behaved as it did and identify deficiencies in its ability to generalize to new, unseen data instances.

Table 4: Accuracy results for different algorithms.

Classifier   C-week   Week + 1   Week + 2
MLP          81.108   81.309     78.531
RBF          77.44    80.371     78.885
EFuNN        79.557   86.257     83.992

In terms of the size of the architecture for the Multilayer Perceptron, this was extremely large, because of the length of the input vector. Input vectors of this size require a significant number of presentations of the training data for the Multilayer Perceptron to successfully learn the mapping between the input vectors and the output vectors. In addition, the small number of hidden nodes contained in the Multilayer Perceptron was unable to represent the mapping between the inputs and outputs. Raising the number of hidden nodes increased the ability of the Multilayer Perceptron to learn, but created a structure that was an order of magnitude larger and thus took even longer to train. In conclusion, it can be seen from the results of this experiment that the advantages of the EFuNN are twofold. One, the time taken to train the EFuNN was far less than for the Multilayer Perceptron. And two, the EFuNN (Table 4) was able to successfully predict yield for a given period of time (Figure 5). This indicates that the EFuNN has the ability to store a better representation of the temporal nature of the data and, to this end, to generalize better than the other methods.

Results shown in Table 4 give the accuracy of our method compared to other classifiers like Bayesian, RBF network, and so forth. Results for those classifiers were constructed using the Weka Experimenter. Weka is a collection of machine learning algorithms for data mining tasks; the algorithms can either be applied directly to a data set or called from your own Java code. Weka contains tools for data preprocessing, classification, regression, clustering, association rules, and visualization. It is also well suited for developing new machine learning schemes (http://www.cs.waikato.ac.nz/ml/weka/).

Figure 5: Training result with EFuNN (desired versus actual weekly weight, kg).

Performance of the learning algorithm is evaluated by the accuracy (%) on Cherry and Campari tomato data in different greenhouses for four years (2004-2007).

In this paper we have described how EFuNN can be applied in the domain of horticulture, especially in the challenging area of decision support for yield prediction, which leads to the production of well-determined amounts. These results do not provide a mechanistic explanation of the factors influencing these fluctuations. However, knowing this information in advance could be valuable for growers for making decisions on climate and crop management. Some of the advantages of the neural network model implemented in this study include the following: (1) the input parameters of the model are currently recorded by most growers, which makes the model easy to implement; (2) the model can "learn" from datasets with new scenarios (new cultivars, different control strategies, improved climate control, etc.); (3) less-experienced growers could use the system, because the decision-making process of the most experienced growers is captured by the data used in the trained networks, and production could thereby become more consistent.

5. Conclusion

It is feasible to implement Intelligent System (IS) techniques, including NN and fuzzy logic, for the modeling and prediction of a greenhouse tomato production system. Data from experimental trials and from a commercial operation for complete production cycles allowed the modeling of tomato yields. IS techniques were robust in dealing with imprecise data, and they have a learning capability when presented with new scenarios; they still need to be tested on different tomato cultivars.

Figure 6: Test results and performance comparison of demand forecasts (desired versus actual weekly weight, kg).

Experimentation results show that EFuNN performed better than other techniques like ANN in terms of lower RMSE error and lower computational load (performance time). As shown in Figure 6, ANN training needs more epochs (longer training time) to achieve a better performance. EFuNN makes use of the knowledge of a FIS and the learning done by a NN. Hence, the neurofuzzy system is able to precisely model the uncertainty and imprecision within the data as well as to incorporate the learning ability of NN. Even though the performance of neurofuzzy systems is dependent on the problem domain, very often the results are better when compared to a pure neural network approach. Compared to NN, an important advantage of neurofuzzy systems is their reasoning ability (If-Then rules) within any particular state. A fully trained EFuNN could be replaced by a set of If-Then rules.

As EFuNN adopts a single-pass training (1 epoch), it is more adaptable and easy to train further online, which might be highly useful for online forecasting. However, an important disadvantage of EFuNN is the determination of the network parameters, like the number and type of MFs for each input variable, the sensitivity threshold, the error threshold, and the learning rates.

Similar procedures might be used to automatically adapt the optimal combination of network parameters for the EFuNN.

Acknowledgments

The authors would like to thank the Wight Salads Group (WSG) Company and Warwick Horticultural Research Institute (WHRI) for providing the datasets which have been used in this research and for their help and cooperation in providing all the necessary information. They acknowledge the support of the Horticulture Development Company (HDC), which provided the PhD studentship support.

References

[1] J. W. Jones, E. Dayan, L. H. Allen, H. van Keulen, and H. Challa, "Dynamic tomato growth and yield model (TOMGRO)," Transactions of the ASAE, vol. 34, no. 2, pp. 663-672, 1991.

[2] J. W. Jones, A. Kening, and C. E. Vallejos, "Reduced state-variable tomato growth model," Transactions of the ASAE, vol. 42, no. 1, pp. 255-265, 1999.

[3] E. Heuvelink, "Growth, development and yield of a tomato crop: periodic destructive measurements in a greenhouse," Scientia Horticulturae, vol. 61, no. 1-2, pp. 77-99, 1995.

[4] E. Heuvelink, Tomato Growth and Yield: Quantitative Analysis and Synthesis [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1996.

[5] P. Abreu, J. F. Meneses, and C. Gary, "Tompousse, a model of yield prediction for tomato crops: calibration study for unheated plastic greenhouses," Acta Horticulturae, vol. 519, pp. 141-150, 2000.

[6] S. R. Adams, "Predicting the weekly fluctuations in glasshouse tomato yields," Acta Horticulturae, vol. 593, pp. 19-23, 2002.

[7] R. M. de Moraes and L. dos Santos Machado, "Evaluation system based on EFuNN for on-line training evaluation in virtual reality," in Proceedings of the 10th Iberoamerican Congress on Pattern Recognition, Image Analysis and Applications (CIARP '05), vol. 3773 of Lecture Notes in Computer Science, pp. 778-785, 2005.

[8] H. U. Frausto, J. G. Pieters, and J. M. Deltour, "Modelling greenhouse temperature by means of auto regressive models," Biosystems Engineering, vol. 84, no. 2, pp. 147-157, 2003.

[9] R. Linker, I. Seginer, and P. O. Gutman, "Optimal CO2 control in a greenhouse modeled with neural networks," Computers and Electronics in Agriculture, vol. 19, no. 3, pp. 289-310, 1998.

[10] I. Seginer, "Some artificial neural network applications to greenhouse environmental control," Computers and Electronics in Agriculture, vol. 18, no. 2-3, pp. 167-186, 1997.

[11] I. Seginer, T. Boulard, and B. J. Bailey, "Neural network models of the greenhouse climate," Journal of Agricultural Engineering Research, vol. 59, no. 3, pp. 203-216, 1994.

[12] P. Salgado and J. B. Cunha, "Greenhouse climate hierarchical fuzzy modelling," Control Engineering Practice, vol. 13, no. 5, pp. 613-628, 2005.

[13] H. Ehrlich, M. Kuhne, and J. Jakel, "Development of a fuzzy control system for greenhouses," Acta Horticulturae, vol. 406, pp. 463-470, 1996.

[14] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338-353, 1965.

[15] B. T. Tien, Neural-Fuzzy Approach for System Identification [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1997.

[16] B. Center and B. P. Verma, "A fuzzy photosynthesis model for tomato," Transactions of the ASAE, vol. 40, no. 3, pp. 815-821, 1997.

[17] M. W. Craven and J. W. Shavlik, "Using sampling and queries to extract rules from trained neural networks," in Proceedings of the 11th International Conference on Machine Learning, pp. 37-45, 1994.

[18] HortiMax, Village Farms USA Reports Record Tomato Production, HortiMax Growing Solutions, Rancho Santa Margarita, Calif, USA, 2008.

[19] I. Taha and J. Ghosh, "Three techniques for extracting rules from feedforward networks," Intelligent Engineering Systems through Artificial Neural Networks, vol. 6, pp. 5-10, 1996.

[20] R. Linker and I. Seginer, "Greenhouse temperature modeling: a comparison between sigmoid neural networks and hybrid models," Mathematics and Computers in Simulation, vol. 65, no. 1-2, pp. 19-29, 2004.

[21] J. Huysmans, B. Baesens, and J. Vanthienen, Using Rule Extraction to Improve the Comprehensibility of Predictive Models, Technical Report KBI 0612, vol. 43, Department of Decision Sciences and Information Management, Katholieke Universiteit Leuven, Leuven, Belgium, 2006.

[22] P. Costa, A Quantified Approach to Tomato Plant Growth Status for Greenhouse Production in a Semi Arid Climate [Ph.D. thesis], University of Arizona, Tucson, Ariz, USA, 2007.

[23] N. Kasabov, "Adaptable neuro production systems," Neurocomputing, vol. 13, no. 2-4, pp. 95-117, 1996.

[24] N. Kasabov, Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, MIT Press, Cambridge, Mass, USA, 1996.

[25] M. H. Jensen, "Steering your tomatoes towards profit," in Greenhouse Crop Production and Engineering Design Short Course, Controlled Environment Agriculture Center, University of Arizona, Tucson, Ariz, USA, 2004.

[26] J. J. Buckley and E. Eslami, An Introduction to Fuzzy Logic and Fuzzy Sets, Physica, Heidelberg, Germany, 2002.

[27] M. T. Hagan, H. B. Demuth, and M. H. Beale, Neural Network Design, University of Colorado, Boulder, Colo, USA, 1995.

[28] E. Heuvelink, "Developmental process," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 53-83, CABI Publishing, Wallingford, UK, 2005.

[29] E. Heuvelink and M. Dorais, "Crop growth and yield," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 85-144, CABI Publishing, Wallingford, UK, 2005.

[30] H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox 5 User's Guide, The MathWorks Inc., Natick, Mass, USA, 2007.

[31] H. Ishibuchi, T. Nakashima, and M. Nii, Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining, Springer, Berlin, Germany, 2004.


Page 2: Research Article Yield Prediction for Tomato Greenhouse Using …downloads.hindawi.com/archive/2013/430986.pdf · 2019. 7. 31. · scenarios where the greenhouse crops have a long

2 ISRN Artificial Intelligence

Although the seasonal fluctuations of yield in greenhousecrops is generally understood to be influenced by the periodicvariation of solar radiation and air temperature greenhousegrowers are also interested in the short- and long-termfluctuations of yield There are a number of useful tools thatcan help growers when they are making short- and long-term decisions For example there are crop models thatpredict yield rates and produce quality in defining climatecontrol strategies in synchronizing crop production withmarket demands in handling the labour force in emergingmarketing strategies and in maintaining a consistent year-round produce quality

As we will show EFuNN offers the advantage that it isable to model nonlinear system relationships and has beenshown in other applications to be very robust when applied todata which is relatively imprecise incomplete and uncertainEFuNN has been successfully applied in applications suchas forecasting control optimization and pattern recognition[7] Intelligence is added to the process by computing thedegree of uncertainty and computing with linguistic terms(fuzzy variables) More accuracy is obtained compared tomechanistic models

Numerous studies have applied either neural networks(NNs) or fuzzy logic in greenhouse production systemsHowever most of them have focused on modeling the airtemperature in greenhouse environments [8ndash11] or optimalcontrol of CO

2with NN Recent techniques have included

modeling the greenhouse environment with hierarchicalfuzzy modeling [12] or controlling the environment withoptimized fuzzy control [13 14] Other studies concerningplant modeling have been reported [15] implemented ahybrid neurofuzzy approach in terms of the system identi-fication and modeling of the total dry weight yield of tomatoand lettuce [16] and developed a fuzzy model to predictnet photosynthesis of tomato crop canopies and the resultsobtained correlated well with the results TOMGRO [1] wasused to model the prediction processes

The objective of this study is to investigate how an IStechnique such as EFuNN performs when applied to currentcrop and climate records from greenhouse growers weeklyprediction of greenhouse tomato yield from environmentaland crop-related variables Yield was characterized by yieldper unit area (Yield kgm2)

The rest of this paper is organized as follows In Section 2we discuss the methodology introduction and materials andmethods Section 3 is results and finally Section 4 presentsthe conclusions

2 Background to Rule Extraction

A wide variety of methods are now available recentlyreviewed in [1 17 18] Reference [17] revisits the Andrewsclassification of rule extraction methods and emphasizesdistinction between decompositional and pedagogicalapproaches Rule extraction methods usually start by findinga minimal network in terms of number of hidden units andoverall connectivity The next simplification the key featureof the method is to cluster the hidden unit activations then

extract combinations of inputs which will activate eachhidden unit singly or together and thus the output generatesrules as the general form of rules shown as follows

If (1199091 le 1199051) AND sdot sdot sdotAND (119909119901 ge 119905119901)Then 119862119894 (1)

Taha and Ghosh [19] suggest binary inputs generatinga truth table from the inputs and simplifying the resultantBoolean function The growth of computational time withnumber of attributes makes minimizing the size of the neuralnetwork essential and somemethods evolve minimal topolo-gies The pedagogical approaches treat the neural network asa black box [20] and use the neural network only to generatetest data for the rule generation algorithm

21 Taxonomy of Rule Extraction Algorithms It is nowbecoming apparent that algorithms can be designed whichextract understandable representations from trained neuralnetworks enabling them to be used for decision-making Inthis section we use a taxonomy presented in [21] which usesthree criteria for classification of rule extraction algorithmsscope of use type of dependency with themethod of solutionof the type ldquoblack boxrdquo and format of the extracted rulesThealgorithms can be a regression or classification algorithmsThere are some algorithms that can be applied to both casessuch as theG-REX [21]On the second criterion an algorithmis considered independent if it is totally independent of themodel type black box used (such as ANN and Support VectorMachines)The algorithms that use information of the black-box methods are called dependent methods Regarding theformat of the extracted rules the methods can be classifiedinto descriptive and predictive The predictive algorithmsperform extraction of rules that allow the expert to makean easy prediction for each possible observation from inputspace If this analysis cannot be made directly the algorithmsare known only as descriptive

3 Materials and Methods

31 EFuNN Evolving Fuzzy Neural Networks andthe EFuNN Algorithm

311 Fuzzy Background Fuzzy inference systems (FISs) arevery useful for inference and handling uncertainty The basicmodels presented are [14] Some important issues that mustbe considered when building an FIS are identification ofstructure and estimation of parameters Efficient structureidentification optimizes the number of fuzzy rules and yieldsbetter convergence [22] Different membership functions(MFs) can be attached to the neurons (triangular Gaussianetc) The number of rules and the membership functionswere estimated by the designers in early implementationsof FIS [14] A more efficient structure is then employed tooptimize the number of rules in adaptive techniques whichare appropriate for learning parameters that change slowlyhandle complex systems with speedily changing characteris-tics considering the fact that it takes a long time after everyimportant change in the system to relearn model parameters[23 24]

ISRN Artificial Intelligence 3

Real valuesinput layer

Bias

Conditionelements

layer(119878119888 119886119888 119874119888)

Rulelayer

(119878119877 119886119877 119874119877)

Actionelements

layer(119878119860 119886119860 119874119860)

Real valuesoutputlayer

(119878119900 119886119900 119874119900)1198601

1198602

1198611

1198612

1198621

1198622

1198831

1198832

DI connections CF connectionsDI11

DI12 DI21

DI22

1198771

1198772

119862119886

CF1

CF2

Figure 1 FuNN architecture

References [14 25] describe some techniques to learnfuzzy structure and parameters In recent years several evolv-ing neurofuzzy systems (ENFSs) have been simulated andthese systems use online learning algorithms that can extractknowledge from data and perform a high-level adaptation ofthe network structure as well as learning of parameters

312TheGeneral Fuzzy Neural Network (FuNN) ArchitectureThe fuzzy neural network (FuNN) (Figure 1) is connectionistfeed-forward architecture with five layers of neurons and fourlayers of connections [26] The first layer of neurons receivesinput information The second layer calculates the fuzzymembership degrees of the input values which belong topredefined fuzzy membership functions for example smallmedium and large The third layer of neurons representsassociations between the input and the output variables fuzzyIf-Then rulesThe fourth layer calculates the degrees to whichoutput membership functions are matched by the input dataand the fifth layer does defuzzification and calculates valuesfor the output variables An FuNN has both the features ofa neural network and a fuzzy inference machine Severaltraining algorithms have been developed for FuNN [26] amodified back-propagation algorithm a genetic algorithmstructural learning with forgetting training and zeroingcombinedmodes Several algorithms for rule extraction fromFuNN have also been developed and applied Each of themrepresents each rule node of a trained FuNN as an If-Thenfuzzy rule

3.2. EFuNN Architecture. EFuNNs are FuNN structures that evolve according to the Evolving Connectionist Systems (ECOS) principles. That is, all nodes in an EFuNN are generated through learning, and the nodes representing membership functions can be modified during learning. As in FuNN, each input variable is represented by a group of spatially arranged neurons that represent different fuzzy domain areas of this variable. Different membership functions can be attached to these neurons (triangular, Gaussian, etc.). New neurons evolve in this layer if, for a given input vector, the corresponding variable value does not belong to any of the existing membership functions to a membership degree greater than a membership threshold; this means that a new fuzzy label neuron or an input variable neuron can be created during the adaptation phase of an EFuNN.

3.3. Yield Records and Climate Data. Crop records and climate data from a tomato greenhouse operation run by WSG (Wight Salads Group) in the Isle of Wight, United Kingdom, were used to design and train evolving fuzzy neural networks for yield prediction. We used a fuzzy inference system for implementing the input parameter characterization. The data include six datasets from two production cycles (S1: 2004 to 2007 and S2: 2008) and one greenhouse section (New Site). The total number of records is 1286, and each record includes 14 parameters characterizing the weekly greenhouse environment and the crop features.

The environmental parameters which are to be controlled are the vapor pressure deficit (VPD) and the differential temperature between daytime and nighttime ($T_d - T_n$). The setpoints for each environmental treatment were VPD = 2.0 kPa, 0.6 kPa, and uncontrolled; $T_d/T_n$ = 26°C/18°C, 20°C/20°C, and 22°C/18°C, respectively, for each compartment. The dates when the crops were planted and the periods over which they were allowed to grow, in the case of both datasets, are summarized in Table 1.

The tomatoes were grown in greenhouses on a high-wire hydroponic system, with CO2 supplementation and computer climate/irrigation control. The greenhouses were equipped with hot-water heating pipes and roof vents for passive cooling.

The crop records consisted of 12 plant samples per greenhouse section that were randomly selected and continuously measured during the production cycle. The crop record data were collected by direct observation and by manually measuring or counting each of the morphological features. The system included an electronic sensor unit which measured the air temperature, humidity, and CO2 concentration in each of the greenhouse sections. Outside weather conditions were determined via a weather station. Daytime, nighttime, and 24 h averages of daily climate data from outside and inside of the greenhouse section were also obtained by the grower; weekly averages were computed to match the weekly crop records.

3.4. Modeling the Parameters. Yield ($Y_a$, kg m$^{-2}$ week$^{-1}$) is of interest to greenhouse growers as a means via which they can develop short-term crop management strategies; it is also useful for labour management strategies. Greenhouse tomato cumulative yield can be described either as fresh weight (Cockshull, 1988) or as dry mass [27]. Both of these studies were performed in northern latitudes, without CO2 enrichment, for short production periods (<100 days), and the plants were cultivated in soil, not hydroponically. These are not currently standard cultivation methods, because most of the greenhouse growers make use of high-technology production facilities.

Yield development is influenced mainly by fruit temperature. This parameter is inversely related to fruit development


Table 1: Summary of datasets. Crop records include samples per week in New House sites.

Season (years)    Greenhouse    Transplanted on, WOYa and date    Crop durationb (weeks)    Cultivar (commercial DS)
1 (2004-2005)     New h21       (26) June 24, 2004                42                        Campari
                  New h22       (28) July 8, 2004                 47                        Campari
2 (2004-2007)     New h21       (31) July 28, 2005                43                        Campari
                  New h22       (25) June 16, 2005                44                        Campari
                  New h22       (33) Aug 11, 2006                 36                        Cherry
                  New h22       (41) Oct 6, 2007                  39                        Cherry

a, b WOY: week of the year number.

for each evolving layer neuron h do
    Create a new rule r
    for each input neuron i do
        Find the condition neuron c with the largest weight W_ch
        Add an antecedent to r of the form "i is c (W_ch)", where W_ch is the confidence factor for that antecedent
    end for
    for each output neuron o do
        Find the action neuron a with the largest weight W_ha
        Add a consequent to r of the form "o is a (W_ha)", where W_ha is the confidence factor for that consequent
    end for
end for

Algorithm 1: EFuNN algorithm steps.
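A minimal Python sketch of these steps is given below. It assumes the trained network exposes its condition-to-rule weights W1 and rule-to-action weights W2 as NumPy matrices, and that the fuzzy label neurons are grouped per variable; these names and shapes are assumptions introduced for illustration, not the paper's actual implementation.

import numpy as np

def extract_rules(W1, W2, condition_groups, action_groups):
    # condition_groups / action_groups map each variable name to a list of
    # (neuron_index, fuzzy_label) pairs for its membership-function neurons.
    # W1: (num_condition_neurons, num_rule_neurons); W2: (num_rule_neurons, num_action_neurons).
    rules = []
    for h in range(W1.shape[1]):                          # each rule (evolving layer) neuron
        rule = {"if": [], "then": []}
        for var, neurons in condition_groups.items():     # each input variable
            idx, label = max(neurons, key=lambda p: W1[p[0], h])   # largest-weight condition neuron
            rule["if"].append(f"{var} is {label} ({W1[idx, h]:.2f})")
        for var, neurons in action_groups.items():        # each output variable
            idx, label = max(neurons, key=lambda p: W2[h, p[0]])   # largest-weight action neuron
            rule["then"].append(f"{var} is {label} ({W2[h, idx]:.2f})")
        rules.append(rule)
    return rules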

rate, and it shows a linear relationship with air temperature [28, 29]. This relationship between yield and the 24 h average air temperature inside the greenhouse ($T_{in,24}$) is shown, for all the datasets, in Figure 3.

3.4.1. Data Processing. The preprocessing steps include averaging over a certain number of points of some data records. We preprocessed 5 environmental variables (CO2, temperature, vapor pressure deficit (VPD), yield, and radiation) for different tomato cultivars from different greenhouses in the WSG area; the raw data were not ready for processing and had to be preprocessed first. For instance, some values in some tomato records were missing, and we had to replace them with 0 before the records could be fed to an artificial neural network for processing. Thus, three preprocessing steps were taken (a minimal sketch follows the list):

(1) Edit each data file and group the same tomato cultivar and type in one file, with all environmental variables of that cultivar gathered from different greenhouses.

(2) For each cultivar, store the values in a Microsoft Excel spreadsheet. Next, in the spreadsheet, 0 replaced null values; some missing VPD and radiation values were replaced by averages.

(3) The Excel spreadsheet content was converted to a .dat file to be fed into Matlab and the EFuNN application.
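The following Python sketch outlines the same three steps; the file names, column names, and the choice of the pandas library are illustrative assumptions rather than the authors' actual tooling (which used Excel and Matlab).

import pandas as pd

# (1) Group records of the same cultivar from different greenhouses into one table.
files = ["new_h21_campari.csv", "new_h22_campari.csv"]     # hypothetical file names
data = pd.concat([pd.read_csv(f) for f in files], ignore_index=True)

# (2) Replace missing values: column averages for VPD and radiation, 0 elsewhere.
for col in ["vpd", "radiation"]:
    data[col] = data[col].fillna(data[col].mean())
data = data.fillna(0)

# (3) Export in a plain-text format that the modelling environment can load.
data.to_csv("efunn_input.dat", sep="\t", index=False)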

3.5. Neural Network Model. Computational NNs have proven to be a powerful tool for solving several types of problems in various real-life fields where approximation of nonlinear functions, classification, identification, and pattern recognition are required. NNs are mathematical representations of biological neurons in the way that they process information as parallel computing units. In general, there are two types of neural network architecture: (1) static (feedforward), in which no feedback or time delays exist, and (2) dynamic neural networks, whose outputs depend on the current or previous inputs, outputs, or states of the network (Demuth et al. [30]).

We consider a neural network that consists of an input layer with $n+1$ nodes, a hidden layer with $h$ units, and an output layer with $l$ units, as follows:

$$y_i = g\left(\sum_{j=1}^{h} w_{ij}\, f\left(\sum_{k=1}^{n+1} v_{jk} x_k\right)\right), \quad i = 1, 2, \ldots, l, \qquad (2)$$

where $x_k$ indicates the $k$th input value, $y_i$ the $i$th output value, $v_{jk}$ a weight connecting the $k$th input node with


Figure 2: Simple EFuNN structure (inputs, fuzzification, rule nodes, defuzzification, outputs).

the $j$th hidden unit, and $w_{ij}$ a weight between the $j$th hidden unit and the $i$th output unit. We start learning with one hidden layer, updating the weights and thresholds to minimize the objective function with any optimization algorithm, and terminate the network training with this number of hidden units when a local minimum of the objective function has been reached. If the desired accuracy is not reached, the number of hidden units is increased, with random weights and thresholds, and the updating is repeated; otherwise, training is finished and stops.
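The forward pass of (2) can be written compactly in Python/NumPy as follows; the layer sizes, the sigmoid choice for f, and the identity choice for g are illustrative assumptions.

import numpy as np

def forward(x, V, W, f=lambda z: 1.0 / (1.0 + np.exp(-z)), g=lambda z: z):
    # x: input vector of length n+1 (bias included); V: (h, n+1); W: (l, h).
    hidden = f(V @ x)          # inner sum over k in (2)
    return g(W @ hidden)       # outer sum over j in (2)

# Example with n+1 = 5 inputs, h = 8 hidden units, and l = 1 output.
rng = np.random.default_rng(0)
y = forward(rng.random(5), rng.standard_normal((8, 5)), rng.standard_normal((1, 8)))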

3.6. The EFuNN Algorithm. EFuNN consists of a five-layer structure which begins with the input layer representing the input variables; each input variable is represented by a group of arranged neurons that form a fuzzy quantization of this variable, which is then presented to the second layer of nodes. Different membership functions (MFs) can be attached to these neurons (triangular, Gaussian, etc.), and the nodes and their membership functions can continue to be modified during learning.

Through node creation and consecutive aggregation, an EFuNN system can adjust over time to changes in the data stream and at the same time preserve its generalization capabilities. If EFuNNs use linear equations to calculate the activation of the rule nodes (instead of using Gaussian and exponential functions), the EFuNN learning procedure is faster. EFuNN also produces better online generalization, which is a result of more accurate node allocation during the learning process. As Algorithm 1 shows, EFuNN allows for continuous training on new data, further testing, and also training of the EFuNN on the test data in an online mode, leading to a significant improvement of the accuracy.

The third layer contains rule nodes that evolve through hybrid supervised learning. These rule nodes represent prototypes of input-output data associations, where each rule node is defined by two vectors of connection weights, $W1(r)$ and $W2(r)$; W2 is adjusted through learning depending on the output error, and W1 is modified based on a similarity measure within the input space in the local area.

The fourth layer of neurons represents fuzzy quantification of the output variables, similar to the input fuzzy neuron representation. The fifth layer represents the real values of the output variables. In the case of a "one-of-n" mode, EFuNN transmits the maximum activation of the rule to the next level. In the case of a "many-of-n" mode, the activation values of m (m > 1) rule nodes that are above an activation threshold are transmitted further in the connectionist structure.

Figure 2 shows the EFuNN used in this paper. The evolving algorithm is based on the principle that rule nodes only exist if they are needed. As each training example is presented, the activation values of the nodes in the rule and action layers and the error over the action nodes are examined: if the maximum rule node activation is below a set threshold, then a rule node is added; if the action node error is above a threshold value, a rule node is added; finally, if the radius of the updated node is larger than a radius threshold, then the updating process is undone and a rule node is added.
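The insertion criteria described above can be sketched as follows; the three threshold values and the argument names are placeholders introduced here for illustration, not the parameters used in the experiments.

def needs_new_rule_node(max_rule_activation, action_error, updated_radius,
                        activation_thr=0.9, error_thr=0.001, radius_thr=1.0):
    # A new rule node is created when the example is not represented well enough
    # by any existing node, or when updating an existing node would stretch it too far.
    if max_rule_activation < activation_thr:
        return True
    if action_error > error_thr:
        return True
    if updated_radius > radius_thr:      # undo the update and insert a node instead
        return True
    return False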

EFuNN has several parameters to be optimized:

(1) number of inputs and outputs;
(2) learning rates for W1 and W2;
(3) pruning control;
(4) aggregation control;
(5) number of membership functions;
(6) shape of the membership functions;
(7) initial sensitivity threshold;
(8) maximum radius;
(9) m-of-n value.

3.7. Input/Output Parameters. The architecture and training mode of EFuNN depend on the input and output parameters as determined by the problem being solved. The tomato crop data describe biological entities that show nonlinear dynamic behaviour and whose response depends not only on several environmental factors but also on the current and previous crop conditions. Greenhouse tomatoes have long production cycles, and the total yield is determined by these parameters. Several environmental factors (light, CO2, air humidity, and air temperature) have both short- and long-term impacts on tomato plants. Air temperature directly affects fruit growth and, according to [28], it is the most influential parameter on the growth process (Figure 3).

Greenhouse tomatoes have a variable fruit growth period, ranging from 40 to 67 days. This deviation results from the changing 24-hour average air temperature in the greenhouse. Here the objective was to design an EFuNN that is simple enough and accurate enough to predict the variables of interest, since for many practical problems variables have different levels of importance and make different contributions to the output. The normalized difference between the fuzzy input values $I_i$ and the connection weights $W_{in}$ of a rule node $n$ is computed as

$$D_n = \frac{\frac{1}{2}\sum_{i=1}^{c} \left| I_i - W_{in} \right|}{\sum_{i=1}^{c} W_{in}}. \qquad (3)$$

Also, it is necessary to find an optimal normalization and assign proper importance factors to the variables, reduce the size of the input vectors, and keep only the most important variables.
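A direct transcription of (3) in Python is shown below; treating I as the fuzzy input vector and W as the rule node's connection weights is our reading of the symbols, not an explicit statement in the paper.

import numpy as np

def normalized_fuzzy_distance(I, W):
    # I: fuzzy input vector; W: connection weight vector of one rule node (same length c).
    I, W = np.asarray(I, dtype=float), np.asarray(W, dtype=float)
    return 0.5 * np.abs(I - W).sum() / W.sum()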


Figure 3: Relationship between the 24 h average air temperature and yield (kg) in the tomato greenhouse.

This dynamic network architecture was chosen because of its memory association and learning capability with sequential and time-varying patterns, which is most likely the biological situation for tomato plants.

4. Experiments Setup, Training, and Performance Evaluation

The training process implies iterative adjustment of the biases of the network by minimizing a performance function when presenting input and target vectors to the network. The mean square error (MSE), selected as the performance function (Table 2), was calculated from the difference between the target output and the network output.
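For reference, the mean square error follows the standard definition (the notation here is ours, not the paper's):

$$\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}\left(t_i - y_i\right)^2,$$

where $t_i$ is the target output, $y_i$ the network output, and $N$ the number of examples.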

The supervised learning in EFuNN is built on the previously explained principles, so that when a new data example is presented, the EFuNN creates a new rule node to memorize the two input and output fuzzy vectors and/or adjusts the winning rule node. After a certain number of examples are applied, some neurons and connections may be pruned or aggregated, and only the best are chosen. Different pruning rules can be applied in order to realize the most successful pruning of unnecessary nodes and connections. Although there are several options for growing the EFuNN, we restricted the learning algorithm to the 1-of-n algorithm.

Using Gaussian and exponential functions, the EFuNN learning procedure produces better online generalization, which is a result of more accurate node allocation during the learning process. EFuNN allowed continuous training on new data; we did further testing and also training of the EFuNN on the test data in an online mode, which led to a significant improvement in accuracy. A significant advantage of EFuNNs is local learning, which allows fast adaptive learning where only a few connections and nodes are changed or created after the entry of each new data item. This is in contrast to global learning algorithms, where for each input vector all connection weights are changed.

Tomato plants were included in the training, testing, and validation process. The initial datasets were randomly divided so that 60% (771) of the records were assigned to the training set, 20% (257) to the validation set, and 20% (257) to the test set. The generalization in each of the networks

Table 2: Test results and performance comparison of demand forecasting.

                                          EFuNN     ANN (Multilayer Perceptron)
Learning epochs                           1         2500
Training error (RMSE)                     0.0013    0.116
Testing error (RMSE)                      0.0092    0.118
Computational load (in billion flops)     0.536     87.2

Table 3: Results from 4 runs of the EFuNN.

RMS (training)    No. of rule nodes    Accuracy (%)
0.0045            77                   82.95
0.0023            48                   86.00
0.0029            75                   84.91
0.0028            81                   80.79

was improved by implementing the early stopping method through the validation set. After training each of the networks to a satisfactory performance, the independent validation set was used to evaluate the prediction performance and to present the results.

We initialized $x(t), x(t-1), x(t-2), \ldots, x(t-n)$ in order to predict $x(t+1)$, and the training was replicated three times using three different samples of training data and different combinations of network parameters. We used four Gaussian MFs for each input variable, as well as the following evolving parameters: number of membership functions = 3, sensitivity threshold $S_{thr}$ = 0.99, learning rate = 0.25, error threshold $Err_{thr}$ = 0.001, and learning rates for the first and second layer = 0.05. EFuNN uses a one-pass training approach. The network parameters were determined using a trial and error approach. The training was repeated three times after reinitializing the network, and the worst errors are reported in Figure 5. Online learning in EFuNN resulted in creating 2122 rule nodes, as depicted in Figure 6. We illustrate the EFuNN training results, and the training performance is summarized in Table 3.
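The sliding-window construction of the training examples can be sketched as follows; the window length and the short yield series used here are placeholders, not the actual WSG data.

import numpy as np

def make_windows(series, n):
    # Each row holds x(t), x(t-1), ..., x(t-n); the target is x(t+1).
    X, y = [], []
    for t in range(n, len(series) - 1):
        X.append(series[t - n:t + 1][::-1])   # most recent value first
        y.append(series[t + 1])
    return np.array(X), np.array(y)

# Example with a hypothetical weekly yield series and a window of 4 past weeks.
weekly_yield = np.array([1.2, 1.4, 1.1, 1.6, 1.8, 1.5, 1.7])
X, y = make_windows(weekly_yield, n=4)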

An investigation of the extracted rule set from run 2, shown in Figure 4, indicates that different compartments of the mentioned environmental variables affected the yield prediction. Rules can be obtained from three different kinds of sources: human experts, numerical data, and neural networks. All the obtained rules can be used in a rule selection method to obtain a smaller linguistic rule-based system with higher performance. We explain the fuzzy-arithmetic-based [31] approach to linguistic rule extraction from a trained EFuNN for modeling problems using some computer simulations. Assume that our training output was five rules for the nonlinear function realized by the trained neural network; we assume that the same five terms (rules) are given for each of the two input variables and for the output variable.


Figure 4: Some rules extracted from the EFuNN simulation.

The number of combinations of terms is 25. Each combination is presented to the trained neural network as a linguistic input vector $A_q = (A_{q1}, A_{q2})$. The corresponding fuzzy output $O_q$ is calculated by fuzzy arithmetic. This calculation is numerically performed for the h-level sets of $A_q$ for $h = 0.1, 0.2, \ldots, 1.0$. The fuzzy output $O_q$ is compared with each of the five linguistic terms. The linguistic term with the minimum difference from the fuzzy output $O_q$ is chosen as the consequent part $B_q$ of the linguistic rule $R_q$ with the antecedent part $A_q$. For example, let us consider the following linguistic rule:

Rule $R_q$: If $X_1$ is medium and $X_2$ is small, then $y$ is $B_q$.

To determine the consequent part $B_q$, the antecedent part of the linguistic rule $R_q$ is presented to the trained neural network as the linguistic input vector, and the corresponding fuzzy output $O_q$ is calculated.
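A schematic Python version of this consequent-selection step is given below; predict_fuzzy_output and difference stand in for the fuzzy-arithmetic computation over the h-level sets described above and are assumptions introduced for illustration.

from itertools import product

terms = ["very small", "small", "medium", "large", "very large"]   # five linguistic terms

def extract_linguistic_rules(predict_fuzzy_output, difference):
    # predict_fuzzy_output(a1, a2): fuzzy output of the trained network for the
    #   linguistic input vector Aq = (a1, a2), computed by fuzzy arithmetic.
    # difference(o, term): distance between the fuzzy output and a linguistic term.
    rules = []
    for a1, a2 in product(terms, terms):            # 5 x 5 = 25 combinations
        o_q = predict_fuzzy_output(a1, a2)
        b_q = min(terms, key=lambda term: difference(o_q, term))
        rules.append(f"If X1 is {a1} and X2 is {a2} then y is {b_q}")
    return rules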

As a result, the rule sets shown in Table 3 are reduced to the range of 5 to 10 rules, which makes them more efficient for operators to work with.

The results obtained from this study indicated that the EFuNN had successfully learnt the input variables and was then able to use these variables as a means of identifying the estimated yield amount.

From the results obtained from testing the EFuNN and its relative performance against the Multilayer Perceptron, it is evident that the EFuNN models this task far better than any of the other models. In addition, the rules extracted from the EFuNN reflect the nature of the training data set and confirm the original hypothesis that more rules would need to be evolved to get better predictions. The poor performance of the Multilayer Perceptron can be explained by examining the nature of this connectionist model. Having a fixed number of hidden nodes limits the number of hyperplanes these models can use to separate the complex feature space. Selection of the optimal number of hidden nodes becomes a case of trial and error. If there are too many, then the ability of the Multilayer Perceptron to generalize to new instances of records is reduced, due to the possibility of overlearning the training examples. This work has sought

Table 4: Accuracy results (%) for different algorithms.

Classifier    C-week    Week + 1    Week + 2
MLP           81.108    81.309      78.531
RBF           77.44     80.371      78.885
EFuNN         79.557    86.257      83.992

to describe a problematic area within horticulture produce management, that of predicting yields in greenhouses. The EFuNN has clearly stood out as an appropriate mechanism for this problem and has performed comparably well against other methods. But the most beneficial aspect of the EFuNN architecture is its rule extraction ability. By extracting the rules from the EFuNN, we can analyse more clearly why the EFuNN made its predictions and identify deficiencies in its ability to generalize to new, unseen data instances.

In terms of the size of the architecture, for the Multilayer Perceptron this was extremely large. This was because of the length of the input vector. Input vectors of this size require a significant number of presentations of the training data for the Multilayer Perceptron to successfully learn the mapping between the input vectors and the output vectors. In addition, the small numbers of hidden nodes contained in the Multilayer Perceptron were unable to represent the mapping between the inputs and outputs. Raising the number of hidden nodes increased the ability of the Multilayer Perceptron to learn, but created a structure that was an order of magnitude larger and thus took even longer to train. In conclusion, it can be seen from the results of this experiment that the advantages of the EFuNN are twofold. One, the time taken to train the EFuNN was far less than for the Multilayer Perceptron. And two, the EFuNN (Table 4) was able to successfully predict yield for a given period of time (Figure 5). This indicates that the EFuNN has the ability to store a better representation of the temporal nature of the data and, to this end, to generalize better than the other methods.

Results shown in Table 4 give the accuracy of our method compared to other classifiers such as the Bayesian classifier and the RBF


Figure 5: Training result with EFuNN (desired versus actual weight (kg) over weeks).

network, and so forth. Results for those classifiers were obtained using the Weka Experimenter. Weka is a collection of machine learning algorithms for data mining tasks; the algorithms can either be applied directly to a data set or called from your own Java code. Weka contains tools for data preprocessing, classification, regression, clustering, association rules, and visualization. It is also well suited for developing new machine learning schemes (http://www.cs.waikato.ac.nz/ml/weka/).

Performance of the learning algorithm is evaluated by the accuracy (%) on Cherry and Campari tomato data in different greenhouses over four years (2004-2007).

In this paper we have described how EFuNN can be applied in the domain of horticulture, especially in the challenging area of decision support for yield prediction, which leads to the production of well-determined amounts. These results do not provide a mechanistic explanation of the factors influencing these fluctuations. However, knowing this information in advance could be valuable for growers when making decisions on climate and crop management. Some of the advantages of the neural network model implemented in this study include the following: (1) the input parameters of the model are currently recorded by most growers, which makes the model easy to implement; (2) the model can "learn" from datasets with new scenarios (new cultivars, different control strategies, improved climate control, etc.); (3) less-experienced growers could use the system, because the decision-making process of the most experienced growers is captured by the data used in the trained networks, and production could thereby become more consistent.

5. Conclusion

It is feasible to implement Intelligent System (IS) techniques, including NNs and fuzzy logic, for modeling and prediction of a greenhouse tomato production system. Data from experimental trials and from a commercial operation for complete production cycles allowed the modeling of tomato yields. IS techniques were robust in dealing with imprecise data, and they have a learning capability when presented with new scenarios; they still need to be tested on different tomato cultivars.

Figure 6: Test results and performance comparison of demand forecasts (desired versus actual weight (kg) over weeks).

The experimental results show that EFuNN performed better than other techniques such as ANN in terms of lower RMSE error and smaller computational load (performance time). As shown in Figure 6, ANN training needs more epochs (longer training time) to achieve a better performance. EFuNN makes use of the knowledge representation of an FIS and of the learning performed by an NN. Hence, the neurofuzzy system is able to precisely model the uncertainty and imprecision within the data as well as to incorporate the learning ability of an NN. Even though the performance of neurofuzzy systems depends on the problem domain, very often the results are better when compared to a pure neural network approach. Compared to an NN, an important advantage of neurofuzzy systems is their reasoning ability (If-Then rules) within any particular state. A fully trained EFuNN could be replaced by a set of If-Then rules.

As EFuNN adopts single-pass training (1 epoch), it is more adaptable and easier to train further online, which might be highly useful for online forecasting. However, an important disadvantage of EFuNN is the determination of the network parameters, such as the number and type of MFs for each input variable, the sensitivity threshold, the error threshold, and the learning rates.

Similar procedures might be used to automatically adapt the optimal combination of network parameters for the EFuNN.

Acknowledgments

The authors would like to thank the Wight Salads Group (WSG) Company and Warwick Horticultural Research Institute (WHRI) for providing the datasets which have been used in this research and for their help and cooperation in providing all the necessary information. They acknowledge the support of the Horticulture Development Company (HDC), which has provided the PhD degree studentship support.

References

[1] J. W. Jones, E. Dayan, L. H. Allen, H. van Keulen, and H. Challa, "Dynamic tomato growth and yield model (TOMGRO)," Transactions of the ASAE, vol. 34, no. 2, pp. 663-672, 1991.
[2] J. W. Jones, A. Kening, and C. E. Vallejos, "Reduced state-variable tomato growth model," Transactions of the ASAE, vol. 42, no. 1, pp. 255-265, 1999.
[3] H. Heuvelink, "Growth, development and yield of a tomato crop: periodic destructive measurements in a greenhouse," Scientia Horticulturae, vol. 61, no. 1-2, pp. 77-99, 1995.
[4] E. Heuvelink, Tomato growth and yield: quantitative analysis and synthesis [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1996.
[5] P. Abreu, J. F. Meneses, and C. Gary, "Tompousse, a model of yield prediction for tomato crops: calibration study for unheated plastic greenhouses," Acta Horticulturae, vol. 519, pp. 141-150, 2000.
[6] S. R. Adams, "Predicting the weekly fluctuations in glasshouse tomato yields," Acta Horticulturae, vol. 593, pp. 19-23, 2002.
[7] R. M. de Moraes and L. dos Santos Machado, "Evaluation system based on EFuNN for on-line training evaluation in virtual reality," in Proceedings of the 10th Iberoamerican Congress on Pattern Recognition, Image Analysis and Applications (CIARP '05), vol. 3773 of Lecture Notes in Computer Science, pp. 778-785, 2005.
[8] H. U. Frausto, J. G. Pieters, and J. M. Deltour, "Modelling greenhouse temperature by means of auto regressive models," Biosystems Engineering, vol. 84, no. 2, pp. 147-157, 2003.
[9] R. Linker, I. Seginer, and P. O. Gutman, "Optimal CO2 control in a greenhouse modeled with neural networks," Computers and Electronics in Agriculture, vol. 19, no. 3, pp. 289-310, 1998.
[10] I. Seginer, "Some artificial neural network applications to greenhouse environmental control," Computers and Electronics in Agriculture, vol. 18, no. 2-3, pp. 167-186, 1997.
[11] I. Seginer, T. Boulard, and B. J. Bailey, "Neural network models of the greenhouse climate," Journal of Agricultural Engineering Research, vol. 59, no. 3, pp. 203-216, 1994.
[12] P. Salgado and J. B. Cunha, "Greenhouse climate hierarchical fuzzy modelling," Control Engineering Practice, vol. 13, no. 5, pp. 613-628, 2005.
[13] H. Ehrlich, M. Kuhne, and J. Jakel, "Development of a fuzzy control system for greenhouses," Acta Horticulturae, vol. 406, pp. 463-470, 1996.
[14] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338-353, 1965.
[15] B. T. Tien, Neural-fuzzy approach for system identification [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1997.
[16] B. Center and B. P. Verma, "A fuzzy photosynthesis model for tomato," Transactions of the ASAE, vol. 40, no. 3, pp. 815-821, 1997.
[17] M. W. Craven and J. W. Shavlik, "Using sampling and queries to extract rules from trained neural networks," in Proceedings of the 11th International Conference on Machine Learning, pp. 37-45, 1994.
[18] HortiMax, Village Farms USA Reports Record Tomato Production, HortiMax Growing Solutions, Rancho Santa Margarita, Calif, USA, 2008.
[19] I. Taha and J. Ghosh, "Three techniques for extracting rules from feedforward networks," Intelligent Engineering Systems through Artificial Neural Networks, vol. 6, pp. 5-10, 1996.
[20] R. Linker and I. Seginer, "Greenhouse temperature modeling: a comparison between sigmoid neural networks and hybrid models," Mathematics and Computers in Simulation, vol. 65, no. 1-2, pp. 19-29, 2004.
[21] J. Huysmans, B. Baesens, and J. Vanthienen, Using Rule Extraction to Improve the Comprehensibility of Predictive Models, Technical Report KBI 0612, vol. 43, Department of Decision Sciences and Information Management, Katholieke Universiteit Leuven, Leuven, Belgium, 2006.
[22] P. Costa, A quantified approach to tomato plant growth status for greenhouse production in a semi arid climate [Ph.D. thesis], University of Arizona, Tucson, Ariz, USA, 2007.
[23] N. Kasabov, "Adaptable neuro production systems," Neurocomputing, vol. 13, no. 2-4, pp. 95-117, 1996.
[24] N. Kasabov, Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, MIT Press, Cambridge, Mass, USA, 1996.
[25] M. H. Jensen, "Steering your tomatoes towards profit," in Greenhouse Crop Production and Engineering Design Short Course, Controlled Environment Agriculture Center, University of Arizona, Tucson, Ariz, USA, 2004.
[26] J. J. Buckley and E. Eslami, An Introduction to Fuzzy Logic and Fuzzy Sets, Physica, Heidelberg, Germany, 2002.
[27] M. T. Hagan, H. B. Demuth, and M. H. Beale, Neural Network Design, University of Colorado, Boulder, Colo, USA, 1995.
[28] E. Heuvelink, "Developmental process," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 53-83, CABI Publishing, Wallingford, UK, 2005.
[29] E. Heuvelink and M. Dorais, "Crop growth and yield," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 85-144, CABI Publishing, Wallingford, UK, 2005.
[30] H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox 5 User's Guide, The MathWorks Inc., Natick, Mass, USA, 2007.
[31] H. Ishibuchi, T. Nakashima, and M. Nii, Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining, Springer, Berlin, Germany, 2004.


Page 3: Research Article Yield Prediction for Tomato Greenhouse Using …downloads.hindawi.com/archive/2013/430986.pdf · 2019. 7. 31. · scenarios where the greenhouse crops have a long

ISRN Artificial Intelligence 3

Real valuesinput layer

Bias

Conditionelements

layer(119878119888 119886119888 119874119888)

Rulelayer

(119878119877 119886119877 119874119877)

Actionelements

layer(119878119860 119886119860 119874119860)

Real valuesoutputlayer

(119878119900 119886119900 119874119900)1198601

1198602

1198611

1198612

1198621

1198622

1198831

1198832

DI connections CF connectionsDI11

DI12 DI21

DI22

1198771

1198772

119862119886

CF1

CF2

Figure 1 FuNN architecture

References [14 25] describe some techniques to learnfuzzy structure and parameters In recent years several evolv-ing neurofuzzy systems (ENFSs) have been simulated andthese systems use online learning algorithms that can extractknowledge from data and perform a high-level adaptation ofthe network structure as well as learning of parameters

312TheGeneral Fuzzy Neural Network (FuNN) ArchitectureThe fuzzy neural network (FuNN) (Figure 1) is connectionistfeed-forward architecture with five layers of neurons and fourlayers of connections [26] The first layer of neurons receivesinput information The second layer calculates the fuzzymembership degrees of the input values which belong topredefined fuzzy membership functions for example smallmedium and large The third layer of neurons representsassociations between the input and the output variables fuzzyIf-Then rulesThe fourth layer calculates the degrees to whichoutput membership functions are matched by the input dataand the fifth layer does defuzzification and calculates valuesfor the output variables An FuNN has both the features ofa neural network and a fuzzy inference machine Severaltraining algorithms have been developed for FuNN [26] amodified back-propagation algorithm a genetic algorithmstructural learning with forgetting training and zeroingcombinedmodes Several algorithms for rule extraction fromFuNN have also been developed and applied Each of themrepresents each rule node of a trained FuNN as an If-Thenfuzzy rule

32 EFUNN Architecture EFuNNs are FuNN structures thatevolve according to the Evolving Connectionist Systems(ECOS) principles That is all nodes in an EFuNN are gen-erated through learningThenodes representingmembershipfunctions can bemodified during learning As in FuNN eachinput variable is represented here by a group of spatiallyarranged neurons to represent different fuzzy domain areasof this variable Different membership functions can beattached to these neurons (triangular Gaussian etc) Newneurons evolve in this layer if for a given input vector thecorresponding variable value does not belong to any of

the existing membership functions to a membership degreegreater than a membership threshold and this means thatnew fuzzy label neuron or an input variable neuron can becreated during the adaptation phase of an EFuNN

33 Yield Records and Climate Data Crop records and clim-ate data from a tomato greenhouse operation inWSG (WightSalads Group) in the Isle of Wight United Kingdom wereused to design and train evolving fuzzy neural networks foryield prediction We used fuzzy inference system for imp-lementing input parameter characterization The data in-cludes six datasets from two production cycles (S1 2004to 2007 and S2 2008) and one greenhouse section (NewSite) The total number of records is 1286 and each recordincluded 14 parameters characterizing the weekly greenhouseenvironment and the crop features

The environmental parameters which are to be controlledare the vapor pressure deficit (VPD) and the differentialtemperature between the daytime to nighttime (119879

119889minus 119879119899)

The setpoints for each environmental treatment were VPD =20 kPa 06 kPa and uncontrolled 119879

119889119879119899= 26∘C18∘C 20∘C

20∘C and 22∘C18∘C respectively for each compartmentThedate when they were planted and the period over which thecrops were allowed to grow in the case of both datasets aresummarized in Table 1

The tomatoes were grown in greenhouses on a high-wire system with hydroponic CO

2supplementation and

computer climateirrigation control The greenhouses wereequipped with hot-water heating pipes and roof vents forpassive cooling

The crop records consisted of 12 plant samples per green-house section that were randomly selected and continuouslymeasured during the production cycle The crop record datawere collected by direct observation and by manually mea-suring or counting each of the morphological features Thissystem included an electronic sensor unit which measuredthe air temperature humidity andCO

2concentration in each

of the greenhouse sections Outside weather conditions weredetermined via a weather station Daytime nighttime and24 h averages of daily climate data from outside and insideof the greenhouse section were also obtained by the growerweekly averages were computed to match the weekly croprecords

34 Modeling the Parameters Yield (119884119886 kgmminus2 weekminus1) is of

interest to greenhouse growers as a means via which theycan develop short-term crop management strategies it isalso useful for labour management strategies Greenhousetomato cumulative yield can be described either as freshweight (Cockshull 1988) or as dry mass [27] Both of thesestudies were performed in northern latitudes without CO

2

enrichment for short production periods (lt100 days) andthe plants were cultivated in soil not hydroponically Theseare not currently standard cultivation methods because mostof the greenhouse growers make use of high-technologyproduction facilities

Yield development is influencedmainly by fruit tempera-ture This parameter is inversely related to fruit development

4 ISRN Artificial Intelligence

Table 1 Summary of datasets Crop records include samples per week in New House Sites

Season (years) Greenhouse Transplanted on WOYa and date Crop durationb (weeks) CultivarCommercial DS

1 New h21 (26) June 24 2004 42 Campari(2004-2005) New h22 (28) July 8 2004 47 Campari

2 New h21 (31) July 28 2005 43 Campari(2004ndash2007) New h22 (25) June 16 2005 44 Campari

New h22 (33) Aug 11 2006 36 CherryNew h22 (41) Oct 6 2007 39 Cherry

a bWOY week of the year number

for each evolving layer neuron h doCreate a new rule rfor each input neuron i do

Find the condition neuron c with the largestweight119882

119888ℎ

Add an antecedent to r of the form ldquoi is c119882119888ℎrdquo where119882

119888119905is the confidence factor for

that antecedentend forfor each output neuron o do

Find the action neuron a with the largest weight119882ℎ119886

Add a consequent to r of the form ldquoo is a119882ℎ119886rdquo where119882

ℎ119886is the confidence factor for

that consequentend for

end for

Algorithm 1 EFuNN algorithm steps

rate and shows a linear relationship with air temperature[28 29] This relationship is shown in Algorithm 1 for allthe datasets which show the relation between yield and airtemperature (119879119894

11989924)

341 Data Processing The steps of the preprocessing includemaking average through a certain amount of the certainpoint of some data recordsWe preprocessed 5 environmentalvariables (CO

2 temperature vapor pressure deficit (VPD)

yield and radiation) for different tomato cultivars fromdifferent greenhouses inWSG area which were not ready forprocessing but had to be pre-processed For instance somevalues in some tomato records were missing and we had toreplace them with 0 being not ready to be fed to an artificialneural network for processing Thus three preprocessingsteps were taken

(1) Edit each data file and group same tomato cultivar andtype in one filewith all environmental variables of thatcultivar gathered from different greenhouses

(2) For each cultivar store values in a Microsoft Excelspreadsheet Next in the spreadsheet 0 replaced nullvalues For some VPD and radiation missing valuesare averaged

(3) The Excel spreadsheet content was converted todatfile input to be fed into Matlab and EFUNN applica-tion

35 Neural NetworkModel ComputationalNNs have provento be a powerful tool to solve several types of problemsin various real life fields where approximation of nonlinearfunctions classification identification and pattern recogni-tion are required NNs are mathematical representations ofbiological neurons in the way that they process informationas parallel computing units In general there are two typesof neural network architecture (1) static (feedforward) inwhich no feedback or time delays exist and (2) dynamicneural networks whose outputs depend on the current orprevious inputs outputs or states of the network (Demuthet al [30])

We consider a neural network that consists of an inputlayer with 119899thorn1 nodes a hidden layer with h units and anoutput layer with l units as follows

119910119894= 119892(

sum119895=1

119908119894119895119891(

119899+1

sum119896=1

V119895119896119883119896)) 119894 = 1 2 119897 (2)

where xk indicates the kth input value 119910119894 the 119894th outputvalue vjk a weight connecting the kth input node with

ISRN Artificial Intelligence 5

Fuzzification

Inputs

Rule nodes Defuzzification

Outputs

Figure 2 Simple EFuNN structure

the jth hidden unit and wij a weight between the jthhidden unit and the 119894th output unit We start learning withone hidden layer which would be updating the weightsand thresholds to minimize the objective function by anyoptimization algorithm then terminate the network trainingwith this number of hidden units when a local minimumof the objective function has been reached If the desiredaccuracy is not reached increase the number of hidden unitswith random weights and thresholds and repeat updatingotherwise finish training and stop

36 The EFuNN Algorithm EFuNN consists of a five-layerstructure which begins with the input layer representinginput variables and each input variable is a presentation ofgroup of arranged neurons representing a fuzzy quantizationof this variable which is then presented to the secondlayer of nodes Different membership functions (MFs) canbe attached to these neurons (triangular Gaussian etc)where continuing modifications of nodes with a membershipfunctions can be applied during learning

Through node creation and consecutive aggregation anEFuNN system can adjust over time to changes in the datastream and at the same time preserve its generalizationcapabilities If EFuNNs use linear equations to calculatethe activation of the rule nodes (instead of using Gaussianfunctions and exponential functions) the EFuNN learningprocedure is faster EFuNN also produces a better onlinegeneralization which is a result of more accurate nodeallocation during the learning process As Algorithm 1 showsEFuNN allows for continuous training on new data furthertesting and also training of the EFuNN on the test data inan online mode leading to a significant improvement of theaccuracy

The third layer contains rule nodes that evolve throughhybrid-supervised learning These rule nodes present proto-types of input-output data associations where each rule nodeis defined by two vectors of connection weights 1198821(119903) and1198822(119903) also W2 is adjusted through learning depending onoutput error andW1 is modified based on similarity measurewithin input space in the local area

The fourth layer of neurons represents fuzzy quantifica-tion of the output variables similar to the input fuzzy neuronsrepresentation The fifth layer represents the real values forthe output variables In the case of a ldquoone-of-nrdquomode EFuNN

transmits the maximum activation of the rule to the nextlevel In the case of a ldquomany-of-nrdquomode the activation valuesofm (119898 gt 1) rule nodes that are above an activation thresholdare transmitted further in the connectionist structure

In this paper EFuNNrsquos Figure 2 shows the evolvingalgorithm is based on the principle that rule nodes only existif they are needed As each training example is presented theactivation values of the nodes the rule and action layers andthe error over the action nodes are examined if themaximumrule node activation is below a set threshold then a rule nodeis added If the action node error is above a threshold value arule node is added Finally if the radius of the updated nodeis larger than a radius threshold then the updating process isundone and a rule node is added

EFuNN has several parameters to be optimized and theyare

(1) number of input and output(2) learning rate forW1 andW2(3) pruning control(4) aggregation control(5) number of membership functions(6) shape of membership functions(7) initial sensitivity threshold(8) maximum radius(9) M-of-n value

37 InputOutput Parameters The architecture and trainingmode of EFuNNdepends on the input and output parametersas determined by the problem being solved The tomato cropdata includes biological entities that show nonlinear dynamicbehaviour and whose response depends not only on severalenvironmental factors but also on the current and previouscrop conditions Greenhouse tomatoes have long productioncycles and the total yield is determined by these parametersThe effects of several environmental factors (light CO

2 air

humidity and air temperature) have both short- and long-term impacts on tomato plants Air temperature directlyaffects fruit growth and according to [28] it is the influentialparameter on the growth process Figure 3

Greenhouse tomatoes have a variable fruit growth periodranging from 40 to 67 days This deviation results from thechanging 24-hour average air temperature in the greenhouseHere the objective was to design an EFuNN that is simpleenough and accurate enough to predict the variables ofinterest since for many practical problems variables have

119863119899=(12) (sum

119888

119894=1

1003816100381610038161003816119868119894 minus1198821198941198991003816100381610038161003816)

sum119888

119894=1119882119894119899

(3)

different levels of importance and make different contribu-tions to the output Also it is necessary to find an optimalnormalization and assign proper importance factors to thevariables reduce the size of input vectors and keep only themost important variablesThis dynamic network architecturewas chosen because of its memory association and learning

6 ISRN Artificial Intelligence

0 5 10 15 20 25

12

1

08

06

04

02

0

Yiel

d (k

g)

Temperature

Figure 3 24 h average air temperature relationshipwith yield grownin tomatoes greenhouse

capability with sequential and time-varying patterns whichis most likely the biological situation for tomato plants

4 Experiments Setup Training andPerformance Evaluation

The training process implies iterative adjustment of the biasesof the network by minimizing a performance function whenpresenting input and target vectors to the networkThemeansquare error (MSE) selected as the performance function(Table 2) was calculated as the difference between the targetoutput and the network output

The supervised learning in EFuNN is built on thepreviously explained principles so that when a new dataexample is presented the EFuNN creates a new rule nodeto memorize the two input and output fuzzy vectors andoradjusts the winning rule node After a certain number ofexamples are applied some neurons and connections may bepruned or aggregated and only the best are chosen Differentpruning rules can be applied in order to realize the mostsuccessful pruning of unnecessary nodes and connectionsAlthough there are several options for growing the EFuNNwe restricted the learning algorithm to the 1-of-n algorithm

Using Gaussian functions and exponential functions theEFuNN learning procedure produces a better online gener-alization which is a result of more accurate node allocationduring the learning process EFuNN allowed continuoustraining on new data we did further testing and also trainingof the EFuNN on the test data in an online mode whichled to a significant improvement in accuracy A significantadvantage of EFuNNs is the local learning which allowed afast adaptive learning where only few connections and nodesare changed or created after the entry of each new data itemThis is in contrast to the global learning algorithmswhere foreach input vector all connection weights changed

Tomato plants were included in the training testingand validation process The initial datasets were randomlydivided so that 60 (771) of the records were assigned to thetraining set 20 (257) to the validation set and 20 (257)to the test set The generalization in each of the networks

Table 2 Test results and performance comparison of demandforecasting

EFuNN ANN(Multilayer perceptron)

Learning epochs 1 2500Training error(RMSE) 00013 0116

Testing error(RMSE) 00092 0118

Computational load(in billion flops) 0536 872

Table 3 Results from 4 runs of the EFuNN

RMS (training) No of rule nodes Accuracy ()00045 77 829500023 48 860000029 75 849100028 81 8079

was improved by implementing the early stopping methodthrough the validation set After training each of the networksto a satisfactory performance the independent validationset was used to evaluate the prediction performance and topresent the results

We initialized 119909(119905) 119909(119905 minus 1) 119909(119905 minus 2) 119909(119905 minus 119899) in orderto predict the 119909(119905 + 1) and the training was replicated threetimes using three different samples of training data anddifferent combinations of network parameters We used fourGaussianMFs for each input variable as well as the followingevolving parameters number of membership functions = 3sensitivity threshold 119878119905ℎ119903 = 099 learning rate = 025 errorthreshold 119864119903119903119905ℎ119903 = 0001 learning rates for first and secondlayer = 005 EFuNN uses a one-pass training approachThe network parameters were determined using a trial anderror approach The training was repeated three times afterreinitializing the network and the worst errors were reportedin Figure 5 Online learning in EFuNN resulted in creating2122 rule nodes as depicted in Figure 6 We illustrate theEFuNN training results and the training performance issummarized in Table 3

An investigation of the extracted rule set shown inFigure 4 from run 2 indicates that different compartmentsfrom the mentioned environmental variables affected yieldprediction Rules can be obtained from three different kindsof sources human experts numerical data and neural net-works All the obtained rules can be used in rule selectionmethod to obtain a smaller linguistic rule-based systemwith a higher performance We explain the fuzzy arithmetic-based [31] approach to linguistic rule extraction from trainedEFuNN for modeling problems using some computer simu-lations Assume that our training output was five rules for thenonlinear function realized by the trained neural network weassume that the five terms (rules) are given for each of the twoinput variables We also assume that the same five terms aregiven for the output variable

ISRN Artificial Intelligence 7

Figure 4 Some rules extracted from EFuNN simulation

The number of combinations of terms is 25 Each com-bination is presented to the trained neural network as alinguistic input vector119860119902 = 119860119902119894 1198601199022) The correspond-ing fuzzy output Oq is calculated by fuzzy arithmetic Thiscalculation is numerically performed for the h-level sets ofAq for ℎ = 01 02 10 The fuzzy output Oq is comparedwith each of the five linguistic termsThe linguistic term withthe minimum difference from the fuzzy output Oq is chosenas the consequent part Bq of the linguistic rule Rq with theantecedent partAq For example let us consider the followinglinguistic rule

Rule Rq If X1 is medium and X2 is small Then y is BqTo determine the consequent part Bq the antecedent part

of the linguistic rule Rq is presented to the trained neuralnetwork as the linguistic input vector The correspondingfuzzy output Oq is calculated

As a result the rules shown in Table 3 will be shrinkedinto the range of [5ndash10] rules which make it more efficientfor operators to work with

The results obtained from this study indicated that theEFuNN had successfully learnt the input variables and wasthen able to use these variables as a means of identifying theestimated yield amount

From the results obtained from testing the EFuNN andits relative performance against Multilayer Perceptron it isevident that the EFuNN models this task far better than anyof the other models In addition the rules extracted fromthe EFuNN reflect the nature of the training data set andconfirm the original hypothesis thatmore ruleswould need tobe evolved to get better predictions Accounting for the poorperformance of theMultilayer Perceptron could be explainedby examining the nature of this connectionist model Havinga fixed number of hidden nodes limits the number ofhyperplanes these models can use to separate the complexfeature space Selection of the optimal number of hiddennodes becomes a case of trial and error If these are too largethen the ability for the Multilayer Perceptron to generalizefor new instances of records is reduced due to the possibilityof overlearning the training examples This work has sought

Table 4 Accuracy results for different algorithms

Classifier C-week Week + 1 Week + 2MLP 81108 81309 78531RBF 7744 80371 78885EFuNN 79557 86257 83992

to describe a problematic area within horticulture producemanagement that of predicting yields in greenhouses TheEFuNN has clearly stood out as an appropriate mechanismfor this problem and has performed comparably well againstother methods But the most beneficial aspect of EFuNNarchitecture is the rule extraction ability By extracting therules from the EFuNN we can analyse why the EFuNNmade it more clear and identified deficiencies in its ability togeneralize to new unseen data instances

In terms of the size of the architecture for the MultilayerPerceptron this was extremely large This was because ofthe length of the input vector Input vectors of this sizerequire a significant amount of presentations of the trainingdata for the Multilayer Perceptron to successfully learn themapping between the input vectors and the output vectorsIn addition the small numbers of hidden nodes containedin the Multilayer Perceptron were unable to represent themapping between the inputs and outputs Raising the numberof hidden nodes increased the ability for the MultilayerPerceptron to learn but created a structure that was anorder of magnitude larger and thus took even longer totrain In conclusion it can be seen from the results of thisexperiment that the advantages of the EFuNN are twofoldOne the time taken to train the EFuNN was far less thanMultilayer Perceptron And two the EFuNN in Table 4 wasable to successfully predict yield for a given period of time inFigure 5This indicates that the EFuNNhas the ability to storea better representation of the temporal nature of the data andto this end generalize better than the other methods

Results shown in Table 4 show the accuracy of ourmethod compared to other classifiers like Bayesian RBF

8 ISRN Artificial Intelligence

0 10 20 30 40 50 60Week

2

15

1

05

0

minus05

Wei

ght (

kg)

DesiredActual

Figure 5 Training result with EFuNN

Network and so forth Results for those classifiers wereconstructed using Weka experimenter Weka is a collec-tion of machine learning algorithms for data mining tasksand the algorithms can either be applied directly to adata set or called from your own Java code Weka con-tains tools for data pre-processing classification regressionclustering association rules and visualization It is alsowell suited for developing new machine learning schemes(httpwwwcswaikatoacnzmlweka)

Performance of the learning algorithm is evaluated by theaccuracy () onCherry andCampari tomato data in differentgreenhouses for four years (2004ndash2007)

In this paper we have described how EFuNN can beapplied in the domain of horticulture especially in thechallenging area of deciding support for yield predictionwhich leads to the production of well-determined amountsThese results do not provide a mechanistic explanation ofthe factors influencing these fluctuations However knowingthis information in advance could be valuable for growers formaking decisions on climate and crop management Someof the advantages of the neural network model implementedin this study include the following (1)The input parametersof the model are currently recorded by most growers whichmakes the model easy to implement (2) the model canldquolearnrdquo from datasets with new scenarios (new cultivarsdifferent control strategies improved climate control etc)(3) less-experienced growers could use the system becausethe decision-making process of themost experienced growersis captured by the data used in the trained networks andproduction could thereby become more consistent

5. Conclusion

It is feasible to implement Intelligent System (IS) techniques, including NNs and fuzzy logic, for modeling and prediction of a greenhouse tomato production system. Data from experimental trials and from a commercial operation covering complete production cycles allowed the modeling of tomato yields. IS techniques were robust in dealing with imprecise data and have a learning capability when presented with new scenarios; they still need to be tested on different tomato cultivars.

Figure 6: Test results and performance comparison of demand forecasts (desired versus actual yield weight, kg, over weeks 0–60).

Experimental results show that EFuNN performed better than other techniques such as the ANN in terms of lower RMSE and lower computational load (processing time). As shown in Figure 6, ANN training needs more epochs (longer training time) to achieve comparable performance. EFuNN combines the knowledge representation of a fuzzy inference system (FIS) with the learning performed by an NN. Hence, the neurofuzzy system is able to model the uncertainty and imprecision within the data as well as to incorporate the learning ability of an NN. Even though the performance of neurofuzzy systems depends on the problem domain, the results are very often better than those of a pure neural network approach. Compared to an NN, an important advantage of neurofuzzy systems is their reasoning ability (If-Then rules) in any particular state; a fully trained EFuNN could be replaced by a set of If-Then rules.
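As a rough illustration of how such extracted If-Then rules could stand in for the trained network at prediction time, the Python sketch below evaluates a tiny hand-written rule base with triangular membership functions and a weighted-average defuzzification. The rules, fuzzy labels, and numbers are invented for the example and are not extracted from the model reported here.

def tri(x, a, b, c):
    """Triangular membership function rising from a to a peak at b and falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Invented fuzzy labels for one input (24 h mean temperature, deg C)
# and centre values for the output (weekly yield, kg).
temp_labels = {"low": (10, 14, 18), "medium": (16, 20, 24), "high": (22, 26, 30)}
yield_centres = {"small": 0.6, "medium": 1.0, "large": 1.4}

# Invented rule base: IF temperature is <label> THEN yield is <label>.
rules = [("low", "small"), ("medium", "large"), ("high", "medium")]

def predict_yield(temperature):
    firing = [(tri(temperature, *temp_labels[ant]), cons) for ant, cons in rules]
    weighted = sum(w * yield_centres[cons] for w, cons in firing)
    total = sum(w for w, _ in firing)
    return weighted / total if total > 0 else None

print(predict_yield(21.0))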

As EFuNN adopts single-pass training (one epoch), it is more adaptable and easier to retrain online, which could be highly useful for online forecasting. However, an important disadvantage of EFuNN is the need to determine the network parameters, such as the number and type of MFs for each input variable, the sensitivity threshold, the error threshold, and the learning rates.
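A highly simplified sketch of the kind of single-pass update such a network performs is given below. It keeps one prototype vector and one output weight per rule node, measures activation with a crude normalised similarity, and uses placeholder values for the sensitivity threshold, error threshold, and learning rates; it illustrates the mechanism, not the implementation used in this work.

import numpy as np

class TinyEvolvingLayer:
    """Single-pass, EFuNN-style rule-node allocation (illustrative only)."""

    def __init__(self, s_thr=0.9, err_thr=0.1, lr1=0.05, lr2=0.05):
        self.s_thr, self.err_thr, self.lr1, self.lr2 = s_thr, err_thr, lr1, lr2
        self.W1, self.W2 = [], []   # rule-node input prototypes and output weights

    def _activation(self, x, w):
        # Similarity in [0, 1]; 1 means the input matches the prototype exactly.
        return 1.0 - np.abs(x - w).sum() / (np.abs(x).sum() + np.abs(w).sum() + 1e-9)

    def learn_one(self, x, y):
        x, y = np.asarray(x, float), float(y)
        if not self.W1:                                   # first example: create a node
            self.W1.append(x.copy()); self.W2.append(y); return
        acts = [self._activation(x, w) for w in self.W1]
        j = int(np.argmax(acts))
        if acts[j] < self.s_thr or abs(y - self.W2[j]) > self.err_thr:
            self.W1.append(x.copy()); self.W2.append(y)   # allocate a new rule node
        else:                                             # update the winning node only
            self.W1[j] += self.lr1 * (x - self.W1[j])
            self.W2[j] += self.lr2 * (y - self.W2[j])

    def predict(self, x):
        acts = [self._activation(np.asarray(x, float), w) for w in self.W1]
        return self.W2[int(np.argmax(acts))]

After one pass over the data, each stored rule node corresponds to one prospective If-Then rule, which is what makes the single-epoch training and online adaptation described above possible.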

Similar procedures might be used to automatically adapt the EFuNN's network parameters towards an optimal combination.
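One simple way to automate that tuning is a small search over the parameter ranges listed above. The Python sketch below assumes a helper that trains a model with a given parameter set and returns its validation RMSE; both the candidate values and the helper are placeholders, not the settings or procedure used in this study.

import itertools
import random

# Candidate values are placeholders, not the settings used in the paper.
grid = {
    "n_membership_functions": [2, 3, 4],
    "sensitivity_threshold": [0.9, 0.95, 0.99],
    "error_threshold": [0.001, 0.01, 0.05],
    "learning_rate": [0.05, 0.1, 0.25],
}

def validation_rmse(params):
    """Placeholder: train an EFuNN with these parameters and return validation RMSE."""
    return random.random()  # stand-in so the sketch runs end to end

candidates = [dict(zip(grid, values)) for values in itertools.product(*grid.values())]
best = min(random.sample(candidates, k=20), key=validation_rmse)
print(best)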

Acknowledgments

The authors would like to thank the Wight Salads Group (WSG) Company and Warwick Horticultural Research Institute (WHRI) for providing the datasets used in this research and for their help and cooperation in providing all the necessary information. They acknowledge the support of the Horticulture Development Company (HDC), which provided the PhD studentship.

References

[1] J. W. Jones, E. Dayan, L. H. Allen, H. van Keulen, and H. Challa, "Dynamic tomato growth and yield model (TOMGRO)," Transactions of the ASAE, vol. 34, no. 2, pp. 663–672, 1991.

[2] J. W. Jones, A. Kening, and C. E. Vallejos, "Reduced state-variable tomato growth model," Transactions of the ASAE, vol. 42, no. 1, pp. 255–265, 1999.

[3] H. Heuvelink, "Growth, development and yield of a tomato crop: periodic destructive measurements in a greenhouse," Scientia Horticulturae, vol. 61, no. 1-2, pp. 77–99, 1995.

[4] E. Heuvelink, Tomato growth and yield: quantitative analysis and synthesis [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1996.

[5] P. Abreu, J. F. Meneses, and C. Gary, "Tompousse, a model of yield prediction for tomato crops: calibration study for unheated plastic greenhouses," Acta Horticulturae, vol. 519, pp. 141–150, 2000.

[6] S. R. Adams, "Predicting the weekly fluctuations in glasshouse tomato yields," Acta Horticulturae, vol. 593, pp. 19–23, 2002.

[7] R. M. de Moraes and L. dos Santos Machado, "Evaluation system based on EFuNN for on-line training evaluation in virtual reality," in Proceedings of the 10th Iberoamerican Congress on Pattern Recognition, Image Analysis and Applications (CIARP '05), vol. 3773 of Lecture Notes in Computer Science, pp. 778–785, 2005.

[8] H. U. Frausto, J. G. Pieters, and J. M. Deltour, "Modelling greenhouse temperature by means of auto regressive models," Biosystems Engineering, vol. 84, no. 2, pp. 147–157, 2003.

[9] R. Linker, I. Seginer, and P. O. Gutman, "Optimal CO2 control in a greenhouse modeled with neural networks," Computers and Electronics in Agriculture, vol. 19, no. 3, pp. 289–310, 1998.

[10] I. Seginer, "Some artificial neural network applications to greenhouse environmental control," Computers and Electronics in Agriculture, vol. 18, no. 2-3, pp. 167–186, 1997.

[11] I. Seginer, T. Boulard, and B. J. Bailey, "Neural network models of the greenhouse climate," Journal of Agricultural Engineering Research, vol. 59, no. 3, pp. 203–216, 1994.

[12] P. Salgado and J. B. Cunha, "Greenhouse climate hierarchical fuzzy modelling," Control Engineering Practice, vol. 13, no. 5, pp. 613–628, 2005.

[13] H. Ehrlich, M. Kuhne, and J. Jakel, "Development of a fuzzy control system for greenhouses," Acta Horticulturae, vol. 406, pp. 463–470, 1996.

[14] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338–353, 1965.

[15] B. T. Tien, Neural-fuzzy approach for system identification [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1997.

[16] B. Center and B. P. Verma, "A fuzzy photosynthesis model for tomato," Transactions of the ASAE, vol. 40, no. 3, pp. 815–821, 1997.

[17] M. W. Craven and J. W. Shavlik, "Using sampling and queries to extract rules from trained neural networks," in Proceedings of the 11th International Conference on Machine Learning, pp. 37–45, 1994.

[18] HortiMax, Village Farms USA Reports Record Tomato Production, HortiMax Growing Solutions, Rancho Santa Margarita, Calif, USA, 2008.

[19] I. Taha and J. Ghosh, "Three techniques for extracting rules from feedforward networks," Intelligent Engineering Systems through Artificial Neural Networks, vol. 6, pp. 5–10, 1996.

[20] R. Linker and I. Seginer, "Greenhouse temperature modeling: a comparison between sigmoid neural networks and hybrid models," Mathematics and Computers in Simulation, vol. 65, no. 1-2, pp. 19–29, 2004.

[21] J. Huysmans, B. Baesens, and J. Vanthienen, Using Rule Extraction to Improve the Comprehensibility of Predictive Models, Technical Report KBI 0612, vol. 43, Department of Decision Sciences and Information Management, Katholieke Universiteit Leuven, Leuven, Belgium, 2006.

[22] P. Costa, A quantified approach to tomato plant growth status for greenhouse production in a semi arid climate [Ph.D. thesis], University of Arizona, Tucson, Ariz, USA, 2007.

[23] N. Kasabov, "Adaptable neuro production systems," Neurocomputing, vol. 13, no. 2–4, pp. 95–117, 1996.

[24] N. Kasabov, Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, MIT Press, Cambridge, Mass, USA, 1996.

[25] M. H. Jensen, "Steering your tomatoes towards profit," in Greenhouse Crop Production and Engineering Design Short Course, Controlled Environment Agriculture Center, University of Arizona, Tucson, Ariz, USA, 2004.

[26] J. J. Buckley and E. Eslami, An Introduction to Fuzzy Logic and Fuzzy Sets, Physica, Heidelberg, Germany, 2002.

[27] M. T. Hagan, H. B. Demuth, and M. H. Beale, Neural Network Design, University of Colorado, Boulder, Colo, USA, 1995.

[28] E. Heuvelink, "Developmental process," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 53–83, CABI Publishing, Wallingford, UK, 2005.

[29] E. Heuvelink and M. Dorais, "Crop growth and yield," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 85–144, CABI Publishing, Wallingford, UK, 2005.

[30] H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox 5 User's Guide, The MathWorks Inc., Natick, Mass, USA, 2007.

[31] H. Ishibuchi, T. Nakashima, and M. Nii, Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining, Springer, Berlin, Germany, 2004.

Page 5: Research Article Yield Prediction for Tomato Greenhouse Using …downloads.hindawi.com/archive/2013/430986.pdf · 2019. 7. 31. · scenarios where the greenhouse crops have a long

ISRN Artificial Intelligence 5

Fuzzification

Inputs

Rule nodes Defuzzification

Outputs

Figure 2 Simple EFuNN structure

the jth hidden unit and wij a weight between the jthhidden unit and the 119894th output unit We start learning withone hidden layer which would be updating the weightsand thresholds to minimize the objective function by anyoptimization algorithm then terminate the network trainingwith this number of hidden units when a local minimumof the objective function has been reached If the desiredaccuracy is not reached increase the number of hidden unitswith random weights and thresholds and repeat updatingotherwise finish training and stop

36 The EFuNN Algorithm EFuNN consists of a five-layerstructure which begins with the input layer representinginput variables and each input variable is a presentation ofgroup of arranged neurons representing a fuzzy quantizationof this variable which is then presented to the secondlayer of nodes Different membership functions (MFs) canbe attached to these neurons (triangular Gaussian etc)where continuing modifications of nodes with a membershipfunctions can be applied during learning

Through node creation and consecutive aggregation anEFuNN system can adjust over time to changes in the datastream and at the same time preserve its generalizationcapabilities If EFuNNs use linear equations to calculatethe activation of the rule nodes (instead of using Gaussianfunctions and exponential functions) the EFuNN learningprocedure is faster EFuNN also produces a better onlinegeneralization which is a result of more accurate nodeallocation during the learning process As Algorithm 1 showsEFuNN allows for continuous training on new data furthertesting and also training of the EFuNN on the test data inan online mode leading to a significant improvement of theaccuracy

The third layer contains rule nodes that evolve throughhybrid-supervised learning These rule nodes present proto-types of input-output data associations where each rule nodeis defined by two vectors of connection weights 1198821(119903) and1198822(119903) also W2 is adjusted through learning depending onoutput error andW1 is modified based on similarity measurewithin input space in the local area

The fourth layer of neurons represents fuzzy quantifica-tion of the output variables similar to the input fuzzy neuronsrepresentation The fifth layer represents the real values forthe output variables In the case of a ldquoone-of-nrdquomode EFuNN

transmits the maximum activation of the rule to the nextlevel In the case of a ldquomany-of-nrdquomode the activation valuesofm (119898 gt 1) rule nodes that are above an activation thresholdare transmitted further in the connectionist structure

In this paper EFuNNrsquos Figure 2 shows the evolvingalgorithm is based on the principle that rule nodes only existif they are needed As each training example is presented theactivation values of the nodes the rule and action layers andthe error over the action nodes are examined if themaximumrule node activation is below a set threshold then a rule nodeis added If the action node error is above a threshold value arule node is added Finally if the radius of the updated nodeis larger than a radius threshold then the updating process isundone and a rule node is added

EFuNN has several parameters to be optimized and theyare

(1) number of input and output(2) learning rate forW1 andW2(3) pruning control(4) aggregation control(5) number of membership functions(6) shape of membership functions(7) initial sensitivity threshold(8) maximum radius(9) M-of-n value

37 InputOutput Parameters The architecture and trainingmode of EFuNNdepends on the input and output parametersas determined by the problem being solved The tomato cropdata includes biological entities that show nonlinear dynamicbehaviour and whose response depends not only on severalenvironmental factors but also on the current and previouscrop conditions Greenhouse tomatoes have long productioncycles and the total yield is determined by these parametersThe effects of several environmental factors (light CO

2 air

humidity and air temperature) have both short- and long-term impacts on tomato plants Air temperature directlyaffects fruit growth and according to [28] it is the influentialparameter on the growth process Figure 3

Greenhouse tomatoes have a variable fruit growth periodranging from 40 to 67 days This deviation results from thechanging 24-hour average air temperature in the greenhouseHere the objective was to design an EFuNN that is simpleenough and accurate enough to predict the variables ofinterest since for many practical problems variables have

119863119899=(12) (sum

119888

119894=1

1003816100381610038161003816119868119894 minus1198821198941198991003816100381610038161003816)

sum119888

119894=1119882119894119899

(3)

different levels of importance and make different contribu-tions to the output Also it is necessary to find an optimalnormalization and assign proper importance factors to thevariables reduce the size of input vectors and keep only themost important variablesThis dynamic network architecturewas chosen because of its memory association and learning

6 ISRN Artificial Intelligence

0 5 10 15 20 25

12

1

08

06

04

02

0

Yiel

d (k

g)

Temperature

Figure 3 24 h average air temperature relationshipwith yield grownin tomatoes greenhouse

capability with sequential and time-varying patterns whichis most likely the biological situation for tomato plants

4 Experiments Setup Training andPerformance Evaluation

The training process implies iterative adjustment of the biasesof the network by minimizing a performance function whenpresenting input and target vectors to the networkThemeansquare error (MSE) selected as the performance function(Table 2) was calculated as the difference between the targetoutput and the network output

The supervised learning in EFuNN is built on thepreviously explained principles so that when a new dataexample is presented the EFuNN creates a new rule nodeto memorize the two input and output fuzzy vectors andoradjusts the winning rule node After a certain number ofexamples are applied some neurons and connections may bepruned or aggregated and only the best are chosen Differentpruning rules can be applied in order to realize the mostsuccessful pruning of unnecessary nodes and connectionsAlthough there are several options for growing the EFuNNwe restricted the learning algorithm to the 1-of-n algorithm

Using Gaussian functions and exponential functions theEFuNN learning procedure produces a better online gener-alization which is a result of more accurate node allocationduring the learning process EFuNN allowed continuoustraining on new data we did further testing and also trainingof the EFuNN on the test data in an online mode whichled to a significant improvement in accuracy A significantadvantage of EFuNNs is the local learning which allowed afast adaptive learning where only few connections and nodesare changed or created after the entry of each new data itemThis is in contrast to the global learning algorithmswhere foreach input vector all connection weights changed

Tomato plants were included in the training testingand validation process The initial datasets were randomlydivided so that 60 (771) of the records were assigned to thetraining set 20 (257) to the validation set and 20 (257)to the test set The generalization in each of the networks

Table 2 Test results and performance comparison of demandforecasting

EFuNN ANN(Multilayer perceptron)

Learning epochs 1 2500Training error(RMSE) 00013 0116

Testing error(RMSE) 00092 0118

Computational load(in billion flops) 0536 872

Table 3 Results from 4 runs of the EFuNN

RMS (training) No of rule nodes Accuracy ()00045 77 829500023 48 860000029 75 849100028 81 8079

was improved by implementing the early stopping methodthrough the validation set After training each of the networksto a satisfactory performance the independent validationset was used to evaluate the prediction performance and topresent the results

We initialized 119909(119905) 119909(119905 minus 1) 119909(119905 minus 2) 119909(119905 minus 119899) in orderto predict the 119909(119905 + 1) and the training was replicated threetimes using three different samples of training data anddifferent combinations of network parameters We used fourGaussianMFs for each input variable as well as the followingevolving parameters number of membership functions = 3sensitivity threshold 119878119905ℎ119903 = 099 learning rate = 025 errorthreshold 119864119903119903119905ℎ119903 = 0001 learning rates for first and secondlayer = 005 EFuNN uses a one-pass training approachThe network parameters were determined using a trial anderror approach The training was repeated three times afterreinitializing the network and the worst errors were reportedin Figure 5 Online learning in EFuNN resulted in creating2122 rule nodes as depicted in Figure 6 We illustrate theEFuNN training results and the training performance issummarized in Table 3

An investigation of the extracted rule set shown inFigure 4 from run 2 indicates that different compartmentsfrom the mentioned environmental variables affected yieldprediction Rules can be obtained from three different kindsof sources human experts numerical data and neural net-works All the obtained rules can be used in rule selectionmethod to obtain a smaller linguistic rule-based systemwith a higher performance We explain the fuzzy arithmetic-based [31] approach to linguistic rule extraction from trainedEFuNN for modeling problems using some computer simu-lations Assume that our training output was five rules for thenonlinear function realized by the trained neural network weassume that the five terms (rules) are given for each of the twoinput variables We also assume that the same five terms aregiven for the output variable

ISRN Artificial Intelligence 7

Figure 4 Some rules extracted from EFuNN simulation

The number of combinations of terms is 25 Each com-bination is presented to the trained neural network as alinguistic input vector119860119902 = 119860119902119894 1198601199022) The correspond-ing fuzzy output Oq is calculated by fuzzy arithmetic Thiscalculation is numerically performed for the h-level sets ofAq for ℎ = 01 02 10 The fuzzy output Oq is comparedwith each of the five linguistic termsThe linguistic term withthe minimum difference from the fuzzy output Oq is chosenas the consequent part Bq of the linguistic rule Rq with theantecedent partAq For example let us consider the followinglinguistic rule

Rule Rq If X1 is medium and X2 is small Then y is BqTo determine the consequent part Bq the antecedent part

of the linguistic rule Rq is presented to the trained neuralnetwork as the linguistic input vector The correspondingfuzzy output Oq is calculated

As a result the rules shown in Table 3 will be shrinkedinto the range of [5ndash10] rules which make it more efficientfor operators to work with

The results obtained from this study indicated that theEFuNN had successfully learnt the input variables and wasthen able to use these variables as a means of identifying theestimated yield amount

From the results obtained from testing the EFuNN andits relative performance against Multilayer Perceptron it isevident that the EFuNN models this task far better than anyof the other models In addition the rules extracted fromthe EFuNN reflect the nature of the training data set andconfirm the original hypothesis thatmore ruleswould need tobe evolved to get better predictions Accounting for the poorperformance of theMultilayer Perceptron could be explainedby examining the nature of this connectionist model Havinga fixed number of hidden nodes limits the number ofhyperplanes these models can use to separate the complexfeature space Selection of the optimal number of hiddennodes becomes a case of trial and error If these are too largethen the ability for the Multilayer Perceptron to generalizefor new instances of records is reduced due to the possibilityof overlearning the training examples This work has sought

Table 4 Accuracy results for different algorithms

Classifier C-week Week + 1 Week + 2MLP 81108 81309 78531RBF 7744 80371 78885EFuNN 79557 86257 83992

to describe a problematic area within horticulture producemanagement that of predicting yields in greenhouses TheEFuNN has clearly stood out as an appropriate mechanismfor this problem and has performed comparably well againstother methods But the most beneficial aspect of EFuNNarchitecture is the rule extraction ability By extracting therules from the EFuNN we can analyse why the EFuNNmade it more clear and identified deficiencies in its ability togeneralize to new unseen data instances

In terms of the size of the architecture for the MultilayerPerceptron this was extremely large This was because ofthe length of the input vector Input vectors of this sizerequire a significant amount of presentations of the trainingdata for the Multilayer Perceptron to successfully learn themapping between the input vectors and the output vectorsIn addition the small numbers of hidden nodes containedin the Multilayer Perceptron were unable to represent themapping between the inputs and outputs Raising the numberof hidden nodes increased the ability for the MultilayerPerceptron to learn but created a structure that was anorder of magnitude larger and thus took even longer totrain In conclusion it can be seen from the results of thisexperiment that the advantages of the EFuNN are twofoldOne the time taken to train the EFuNN was far less thanMultilayer Perceptron And two the EFuNN in Table 4 wasable to successfully predict yield for a given period of time inFigure 5This indicates that the EFuNNhas the ability to storea better representation of the temporal nature of the data andto this end generalize better than the other methods

Results shown in Table 4 show the accuracy of ourmethod compared to other classifiers like Bayesian RBF

8 ISRN Artificial Intelligence

0 10 20 30 40 50 60Week

2

15

1

05

0

minus05

Wei

ght (

kg)

DesiredActual

Figure 5 Training result with EFuNN

Network and so forth Results for those classifiers wereconstructed using Weka experimenter Weka is a collec-tion of machine learning algorithms for data mining tasksand the algorithms can either be applied directly to adata set or called from your own Java code Weka con-tains tools for data pre-processing classification regressionclustering association rules and visualization It is alsowell suited for developing new machine learning schemes(httpwwwcswaikatoacnzmlweka)

Performance of the learning algorithm is evaluated by theaccuracy () onCherry andCampari tomato data in differentgreenhouses for four years (2004ndash2007)

In this paper we have described how EFuNN can beapplied in the domain of horticulture especially in thechallenging area of deciding support for yield predictionwhich leads to the production of well-determined amountsThese results do not provide a mechanistic explanation ofthe factors influencing these fluctuations However knowingthis information in advance could be valuable for growers formaking decisions on climate and crop management Someof the advantages of the neural network model implementedin this study include the following (1)The input parametersof the model are currently recorded by most growers whichmakes the model easy to implement (2) the model canldquolearnrdquo from datasets with new scenarios (new cultivarsdifferent control strategies improved climate control etc)(3) less-experienced growers could use the system becausethe decision-making process of themost experienced growersis captured by the data used in the trained networks andproduction could thereby become more consistent

5 Conclusion

It is feasible to implement Intelligent System (IS) techniquesincluding NN and fuzzy logic for modeling and predictingof a greenhouse tomato production system Data from exper-imental trials and from a commercial operation for completeproduction cycles allowed the modeling of tomato yields IStechniques were robust in dealing with imprecise data andthey have a learning capability when presented with newscenarios and need to be tested on different tomato cultivars

0 10 20 30 40 50 60Week

25

2

15

1

05

0

Wei

ght (

kg)

DesiredActual

Figure 6 Test results and performance comparison of demandforecasts

Experimentation results show that EFuNN performedbetter than other techniques like ANN in terms of low RMSEerror and less computational loads (performance time) Asshowed in Figure 6 ANN training needsmore epochs (longertraining time) to achieve a better performance EFuNNmakes use of the knowledge of FIS and the learning doneby NN Hence the neurofuzzy system is able to preciselymodel the uncertainty and imprecision within the data aswell as to incorporate the learning ability of NN Even thoughthe performance of neurofuzzy systems is dependent on theproblems domain very often the results are better whilecompared to a pure neural network approach Comparedto NN an important advantage of neurofuzzy systems is itsreasoning ability (If-Then rules) within any particular stateA fully trained EFuNN could be replaced by a set of If-Thenrules

As EFuNN adopts a single pass training (1 epoch) it ismore adaptable and easy for further online training whichmight be highly useful for online forecasting However animportant disadvantage of EFuNN is the determination ofthe network parameters like number and type of MF for eachinput variable sensitivity threshold error threshold and thelearning rates

Similar procedures might be used to automatically adaptthe optimal combination of network parameters for theEFuNN

Acknowledgments

The authers would like to thank the Wight Salads Group(WSG) Company andWarwick Horticultural Research Insti-tute (WHRI) for providing the datasets which have been usedin this research and for their help and cooperation in pro-viding all the necessary information They acknowledge thesupport of the Horticulture Development Company (HDC)who has provided the PhD degree studentship support

References

[1] J. W. Jones, E. Dayan, L. H. Allen, H. van Keulen, and H. Challa, "Dynamic tomato growth and yield model (TOMGRO)," Transactions of the ASAE, vol. 34, no. 2, pp. 663–672, 1991.
[2] J. W. Jones, A. Kening, and C. E. Vallejos, "Reduced state-variable tomato growth model," Transactions of the ASAE, vol. 42, no. 1, pp. 255–265, 1999.
[3] H. Heuvelink, "Growth, development and yield of a tomato crop: periodic destructive measurements in a greenhouse," Scientia Horticulturae, vol. 61, no. 1-2, pp. 77–99, 1995.
[4] E. Heuvelink, Tomato growth and yield: quantitative analysis and synthesis [PhD thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1996.
[5] P. Abreu, J. F. Meneses, and C. Gary, "Tompousse, a model of yield prediction for tomato crops: calibration study for unheated plastic greenhouses," Acta Horticulturae, vol. 519, pp. 141–150, 2000.
[6] S. R. Adams, "Predicting the weekly fluctuations in glasshouse tomato yields," Acta Horticulturae, vol. 593, pp. 19–23, 2002.
[7] R. M. de Moraes and L. dos Santos Machado, "Evaluation system based on EFuNN for on-line training evaluation in virtual reality," in Proceedings of the 10th Iberoamerican Congress on Pattern Recognition, Image Analysis and Applications (CIARP '05), vol. 3773 of Lecture Notes in Computer Science, pp. 778–785, 2005.
[8] H. U. Frausto, J. G. Pieters, and J. M. Deltour, "Modelling greenhouse temperature by means of auto regressive models," Biosystems Engineering, vol. 84, no. 2, pp. 147–157, 2003.
[9] R. Linker, I. Seginer, and P. O. Gutman, "Optimal CO2 control in a greenhouse modeled with neural networks," Computers and Electronics in Agriculture, vol. 19, no. 3, pp. 289–310, 1998.
[10] I. Seginer, "Some artificial neural network applications to greenhouse environmental control," Computers and Electronics in Agriculture, vol. 18, no. 2-3, pp. 167–186, 1997.
[11] I. Seginer, T. Boulard, and B. J. Bailey, "Neural network models of the greenhouse climate," Journal of Agricultural Engineering Research, vol. 59, no. 3, pp. 203–216, 1994.
[12] P. Salgado and J. B. Cunha, "Greenhouse climate hierarchical fuzzy modelling," Control Engineering Practice, vol. 13, no. 5, pp. 613–628, 2005.
[13] H. Ehrlich, M. Kuhne, and J. Jakel, "Development of a fuzzy control system for greenhouses," Acta Horticulturae, vol. 406, pp. 463–470, 1996.
[14] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338–353, 1965.
[15] B. T. Tien, Neural-fuzzy approach for system identification [PhD thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1997.
[16] B. Center and B. P. Verma, "A fuzzy photosynthesis model for tomato," Transactions of the ASAE, vol. 40, no. 3, pp. 815–821, 1997.
[17] M. W. Craven and J. W. Shavlik, "Using sampling and queries to extract rules from trained neural networks," in Proceedings of the 11th International Conference on Machine Learning, pp. 37–45, 1994.
[18] HortiMax, Village Farms USA Reports Record Tomato Production, HortiMax Growing Solutions, Rancho Santa Margarita, Calif, USA, 2008.
[19] I. Taha and J. Ghosh, "Three techniques for extracting rules from feedforward networks," Intelligent Engineering Systems through Artificial Neural Networks, vol. 6, pp. 5–10, 1996.
[20] R. Linker and I. Seginer, "Greenhouse temperature modeling: a comparison between sigmoid neural networks and hybrid models," Mathematics and Computers in Simulation, vol. 65, no. 1-2, pp. 19–29, 2004.
[21] J. Huysmans, B. Baesens, and J. Vanthienen, Using Rule Extraction to Improve the Comprehensibility of Predictive Models, Technical Report KBI 0612, vol. 43, Department of Decision Sciences and Information Management, Katholieke Universiteit Leuven, Leuven, Belgium, 2006.
[22] P. Costa, A quantified approach to tomato plant growth status for greenhouse production in a semi arid climate [PhD thesis], University of Arizona, Tucson, Ariz, USA, 2007.
[23] N. Kasabov, "Adaptable neuro production systems," Neurocomputing, vol. 13, no. 2–4, pp. 95–117, 1996.
[24] N. Kasabov, Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, MIT Press, Cambridge, Mass, USA, 1996.
[25] M. H. Jensen, "Steering your tomatoes towards profit," in Greenhouse Crop Production and Engineering Design Short Course, Controlled Environment Agriculture Center, University of Arizona, Tucson, Ariz, USA, 2004.
[26] J. J. Buckley and E. Eslami, An Introduction to Fuzzy Logic and Fuzzy Sets, Physica, Heidelberg, Germany, 2002.
[27] M. T. Hagan, H. B. Demuth, and M. H. Beale, Neural Network Design, University of Colorado, Boulder, Colo, USA, 1995.
[28] E. Heuvelink, "Developmental processes," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 53–83, CABI Publishing, Wallingford, UK, 2005.
[29] E. Heuvelink and M. Dorais, "Crop growth and yield," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 85–144, CABI Publishing, Wallingford, UK, 2005.
[30] H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox 5 User's Guide, The MathWorks Inc., Natick, Mass, USA, 2007.
[31] H. Ishibuchi, T. Nakashima, and M. Nii, Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining, Springer, Berlin, Germany, 2004.





Page 9: Research Article Yield Prediction for Tomato Greenhouse Using …downloads.hindawi.com/archive/2013/430986.pdf · 2019. 7. 31. · scenarios where the greenhouse crops have a long

ISRN Artificial Intelligence 9

[2] J. W. Jones, A. Kening, and C. E. Vallejos, "Reduced state-variable tomato growth model," Transactions of the ASAE, vol. 42, no. 1, pp. 255–265, 1999.

[3] H. Heuvelink, "Growth, development and yield of a tomato crop: periodic destructive measurements in a greenhouse," Scientia Horticulturae, vol. 61, no. 1-2, pp. 77–99, 1995.

[4] E. Heuvelink, Tomato Growth and Yield: Quantitative Analysis and Synthesis [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1996.

[5] P. Abreu, J. F. Meneses, and C. Gary, "Tompousse, a model of yield prediction for tomato crops: calibration study for unheated plastic greenhouses," Acta Horticulturae, vol. 519, pp. 141–150, 2000.

[6] S. R. Adams, "Predicting the weekly fluctuations in glasshouse tomato yields," Acta Horticulturae, vol. 593, pp. 19–23, 2002.

[7] R. M. de Moraes and L. dos Santos Machado, "Evaluation system based on EFuNN for on-line training evaluation in virtual reality," in Proceedings of the 10th Iberoamerican Congress on Pattern Recognition, Image Analysis and Applications (CIARP '05), vol. 3773 of Lecture Notes in Computer Science, pp. 778–785, 2005.

[8] H. U. Frausto, J. G. Pieters, and J. M. Deltour, "Modelling greenhouse temperature by means of auto regressive models," Biosystems Engineering, vol. 84, no. 2, pp. 147–157, 2003.

[9] R. Linker, I. Seginer, and P. O. Gutman, "Optimal CO2 control in a greenhouse modeled with neural networks," Computers and Electronics in Agriculture, vol. 19, no. 3, pp. 289–310, 1998.

[10] I. Seginer, "Some artificial neural network applications to greenhouse environmental control," Computers and Electronics in Agriculture, vol. 18, no. 2-3, pp. 167–186, 1997.

[11] I. Seginer, T. Boulard, and B. J. Bailey, "Neural network models of the greenhouse climate," Journal of Agricultural Engineering Research, vol. 59, no. 3, pp. 203–216, 1994.

[12] P. Salgado and J. B. Cunha, "Greenhouse climate hierarchical fuzzy modelling," Control Engineering Practice, vol. 13, no. 5, pp. 613–628, 2005.

[13] H. Ehrlich, M. Kuhne, and J. Jakel, "Development of a fuzzy control system for greenhouses," Acta Horticulturae, vol. 406, pp. 463–470, 1996.

[14] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338–353, 1965.

[15] B. T. Tien, Neural-Fuzzy Approach for System Identification [Ph.D. thesis], Wageningen Agricultural University, Wageningen, The Netherlands, 1997.

[16] B. Center and B. P. Verma, "A fuzzy photosynthesis model for tomato," Transactions of the ASAE, vol. 40, no. 3, pp. 815–821, 1997.

[17] M. W. Craven and J. W. Shavlik, "Using sampling and queries to extract rules from trained neural networks," in Proceedings of the 11th International Conference on Machine Learning, pp. 37–45, 1994.

[18] HortiMax, Village Farms USA Reports Record Tomato Production, HortiMax Growing Solutions, Rancho Santa Margarita, Calif, USA, 2008.

[19] I. Taha and J. Ghosh, "Three techniques for extracting rules from feedforward networks," Intelligent Engineering Systems through Artificial Neural Networks, vol. 6, pp. 5–10, 1996.

[20] R. Linker and I. Seginer, "Greenhouse temperature modeling: a comparison between sigmoid neural networks and hybrid models," Mathematics and Computers in Simulation, vol. 65, no. 1-2, pp. 19–29, 2004.

[21] J. Huysmans, B. Baesens, and J. Vanthienen, Using Rule Extraction to Improve the Comprehensibility of Predictive Models, Technical Report KBI 0612, vol. 43, Department of Decision Sciences and Information Management, Katholieke Universiteit Leuven, Leuven, Belgium, 2006.

[22] P. Costa, A Quantified Approach to Tomato Plant Growth Status for Greenhouse Production in a Semi-Arid Climate [Ph.D. thesis], University of Arizona, Tucson, Ariz, USA, 2007.

[23] N. Kasabov, "Adaptable neuro production systems," Neurocomputing, vol. 13, no. 2–4, pp. 95–117, 1996.

[24] N. Kasabov, Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, MIT Press, Cambridge, Mass, USA, 1996.

[25] M. H. Jensen, "Steering your tomatoes towards profit," in Greenhouse Crop Production and Engineering Design Short Course, Controlled Environment Agriculture Center, University of Arizona, Tucson, Ariz, USA, 2004.

[26] J. J. Buckley and E. Eslami, An Introduction to Fuzzy Logic and Fuzzy Sets, Physica, Heidelberg, Germany, 2002.

[27] M. T. Hagan, H. B. Demuth, and M. H. Beale, Neural Network Design, University of Colorado, Boulder, Colo, USA, 1995.

[28] E. Heuvelink, "Developmental processes," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 53–83, CABI Publishing, Wallingford, UK, 2005.

[29] E. Heuvelink and M. Dorais, "Crop growth and yield," in Tomatoes, E. Heuvelink, Ed., Crop Production Science in Horticulture Series, pp. 85–144, CABI Publishing, Wallingford, UK, 2005.

[30] H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox 5 User's Guide, The MathWorks Inc., Natick, Mass, USA, 2007.

[31] H. Ishibuchi, T. Nakashima, and M. Nii, Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining, Springer, Berlin, Germany, 2004.
