2014 Third International Conference on Agro-Geoinformatics, Beijing, China
Observation Capability Reasoning of Optical Satellite Sensors for Soil Moisture Monitoring
Xiaolei Wang, Nengcheng Chen, Xunliang Yang, Zeqiang Chen State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing
Wuhan University, Wuhan 430079, China [email protected], [email protected], [email protected], [email protected]
Abstract—The accuracy of soil moisture retrieval depends on the sensors' observation capability. Current methods use simple inference rules that give users only obvious results and establish no clear connection between sensors and a specific application. This paper proposes an ontology and reasoning rules for the observation capability of optical satellite sensors, in order to mine the deeper observation capability of sensors for soil moisture monitoring. The parameters of soil moisture inversion models are derived from the band types recorded in satellite sensors' images, so we define rules over band types to make this connection available. We build and manipulate the sensor ontology based on the SSN ontology for inference, and execute the reasoning with the Jena rule engine. The rules were applied to several sensors and inversion models to reason about their observation capabilities, and the method was used in an application to support better selection among available sensors.
Keywords—observation capability reasoning; optical satellite sensor; soil moisture; sensor web; semantic web
I. INTRODUCTION
Soil moisture can be monitored by diverse sensors, most of which are optical satellite sensors. Soil moisture information plays an important role in hydrological applications and disaster monitoring, and the accuracy of soil moisture retrieval depends on the sensors' observation capability. Different methods have been developed to ensure interoperability in reasoning over sensor observations [1-3], but they use simple inference rules that provide only obvious results and establish no clear connection between sensors and a specific application.
Related work on sensor reasoning falls into several areas, as follows:
Reasoning over spatial and temporal information: In 2010, Koubarakis and Kyzirakos [4] developed the data model stRDF and the query language stSPARQL for modelling spatial and temporal information in the Semantic Sensor Web.
Reasoning over sensor observations to understand weather events: Thirunarayan et al. [5] explore how an abductive reasoning framework can benefit the formalization and interpretation of sensor data to gain situation awareness. Calder et al. [1] describe a semantic data validation tool that observes incoming real-time sensor data and reasons against a set of rules specific to the scientific domain to which the data belong.
Reasoning related to capability: In 2011, De Mel et al. [6] proposed a service-oriented reasoning architecture for resource-task assignment in sensor networks. Esswein et al. [7] present an ontology-based approach for data quality inference on streaming observation data originating from large-scale sensor networks.
Reasoning based on ontology methodology: In 2013, Fernandez et al. [3] proposed an ontology alignment architecture for the Semantic Sensor Web that uses fuzzy logic techniques to combine similarity measures between entities of different ontologies.
Beyond these, one system offers good operability in applications: Anantharam et al. [8] demonstrate SemMOB, which enables dynamic registration of sensors via mobile devices, search, and near real-time inference over sensor observations in ad-hoc mobile environments.
The current situation can be summarized as follows: spatial and temporal reasoning is mature; the data used in reasoning mostly come from in-situ sensors; and the rules remain relatively simple, targeting single weather events. It is therefore still hard to evaluate the capability of sensors.
A large number of research papers have introduced a variety of methods to retrieve soil moisture information from different types of remote sensing data, such as optical or radar data [9]. For example, one approach investigates the possible use of satellite sensor data to extract soil moisture fields and validates the developed systems by means of in-situ measurements [10].
This paper proposes an ontology and reasoning rules for the observation capability of optical satellite sensors, in order to mine the deeper observation capability of sensors for soil moisture monitoring.
With this background on how the Sensor Web and soil moisture relate to sensors' observation capability, the remainder of the paper is organized as follows. Section II discusses the approach to reasoning with the Sensor Web for soil moisture, and in particular describes the ontology on which the reasoning is focused. Section III proposes the reasoning rules. Section IV describes our prototype implementation that applies the rules to soil moisture data and provides sample queries for illustrative purposes. Section V concludes with suggestions for future work.
II. METHOD
A. Process
In applications, it is necessary to apply temporal, spatial and thematic filtering to query the sensors that can monitor soil moisture. In this paper we focus on thematic filtering and omit the temporal and spatial filtering process. The overall framework is shown in Fig. 1.
Fig. 1. The overall framework
From top to bottom, the inversion of soil moisture requires different models depending on vegetation cover. Typical inversion models include the Apparent Thermal Inertia (ATI) [11] for low vegetation cover and the Vegetation Supply Water Index (VSWI) [12] for high vegetation cover. These models define the related parameters, and it is critical to resolve where these parameters come from; once the connection between models and sensors is built, soil moisture inversion can be achieved. We observe that the parameters of the soil moisture inversion models are derived from the band types recorded in satellite sensors' images.
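The two models named above can be sketched numerically. As a minimal illustration (not the authors' calibrated implementations), the forms commonly given in the literature are ATI = (1 - albedo)/(Tday - Tnight) and VSWI = NDVI/Ts; the reflectance and temperature values below are made up.

```python
# Sketch of the two inversion models named above, using forms commonly
# given in the literature; exact coefficients and units vary by study,
# so treat these as illustrative, not as the paper's calibrated models.

def ndvi(nir, red):
    """NDVI from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

def apparent_thermal_inertia(albedo, t_day, t_night):
    """ATI: higher thermal inertia implies wetter soil (low vegetation cover)."""
    return (1.0 - albedo) / (t_day - t_night)

def vswi(ndvi_value, canopy_temp_k):
    """VSWI: ratio of NDVI to canopy temperature (high vegetation cover)."""
    return ndvi_value / canopy_temp_k

# Example with made-up reflectance and temperature values:
n = ndvi(nir=0.45, red=0.15)                                  # 0.5
print(round(apparent_thermal_inertia(0.2, 305.0, 285.0), 3))  # 0.04
print(round(vswi(n, 300.0), 5))
```

Both quantities can be computed band-wise from a sensor's imagery, which is exactly why the band types a sensor provides determine which model it can feed.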
From bottom to top, the observation capability of an optical sensor, e.g. MODIS, comes from information about its observation data, so the quality of the data determines the observation capability. This information is described in SensorML [13]; in particular, SensorML describes the wavelength range of each band of a sensor, but the band types themselves are not clearly marked. Once the band types are made explicit, inference rules can match a sensor to the corresponding soil moisture inversion models. We therefore build a domain ontology based on SSN [14] and define rules over band types to make this matching available.
B. Realizing the ontology extension based on SSN
Sensors can be modeled in the OGC Sensor Model Language (SensorML) (Botts & Robin, 2007), which has a clear hierarchy for describing sensor metadata. This paper adopts the concepts and associations of SensorML and adds some concepts about observation data to build an ontology that can describe the capability of heterogeneous sensors in the Sensor Web. The extension to SSN reuses SSN concepts together with commonly used sensor domain concepts and focuses on the observation capability of sensors, so we name the new ontology SSNOC.
The ontology is represented as a typed graph in which a node represents a concept and an edge a relationship between two concepts. To capture the meanings, properties and relationships of sensor resources, we use the Web Ontology Language (OWL) to describe the semantic information within the SSNOC. The SSNOC pattern describes a sensor in terms of its information, its data type, and the process element type it takes, and can thereby meet the requirements of particular users.
As shown in Fig. 2, the ObservationCapability class is a subclass of the Property class (ssnoc:ObservationCapability ⊆ ssn:Property), as is the ObservationProperty class (ssnoc:ObservationProperty ⊆ ssn:Property). The subclasses of ssnoc:ObservationProperty fall into three sections: a temporal section representing the temporal information of sensors and observations; a spatial section representing their spatial information; and a thematic section representing their thematic information.
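The subclass relations just listed can be illustrated with a toy, dependency-free triple store. The prefixed class names follow Fig. 2, but the triples and the closure query are our own sketch, not the published SSN/SSNOC ontology files.

```python
# A toy, dependency-free sketch of the subclass relations above, kept as a
# set of (subject, predicate, object) triples; URIs are abbreviated to
# prefixed names for readability.
SUBCLASS_OF = "rdfs:subClassOf"

triples = {
    ("ssnoc:ObservationCapability", SUBCLASS_OF, "ssn:Property"),
    ("ssnoc:ObservationProperty",   SUBCLASS_OF, "ssn:Property"),
    ("ssnoc:Temporal", SUBCLASS_OF, "ssnoc:ObservationProperty"),
    ("ssnoc:Spatial",  SUBCLASS_OF, "ssnoc:ObservationProperty"),
    ("ssnoc:Thematic", SUBCLASS_OF, "ssnoc:ObservationProperty"),
}

def subclasses_of(cls):
    """All (transitive) subclasses of a class under rdfs:subClassOf."""
    direct = {s for (s, p, o) in triples if p == SUBCLASS_OF and o == cls}
    return direct | {sub for d in direct for sub in subclasses_of(d)}

print(sorted(subclasses_of("ssnoc:ObservationProperty")))
# ['ssnoc:Spatial', 'ssnoc:Temporal', 'ssnoc:Thematic']
```

In the actual system the same structure would live in an OWL file and be queried by a reasoner; the set-based closure here only mirrors that behaviour for the handful of classes shown in the figure.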
Fig. 2. The structure of ontology
To express more information about sensors, the classes shown in Fig. 2 contain sub-classes whose details are omitted here for reasons of space. Concepts and properties in the SSNOC are annotated with rdfs:comment, rdfs:isDefinedBy, rdfs:label, and rdfs:seeAlso. Where appropriate, concepts were included to enable linking to external ontologies that describe information types.
The ontology can describe special information about sensors, including their observation capability. It can describe instances of weather stations, hydrologic stations, and satellite sensors monitoring soil moisture, as well as the details of sensors relevant to a particular application, so users can find sensors through the intended-application field described in SSNOC.
III. REASONING RULES
In this paper we omit temporal and spatial rules [4] and focus on thematic rules. The thematic rules have three levels. Level 1 infers the band types of sensors and of inversion models, respectively; level 2 gives every sensor a collection of applicable inversion models; and level 3 ranks the sensors' observation capabilities according to the merits of their inversion models.
A. Rules level 1 - BandCollection
There are two definitions at this level, as shown in Fig. 3. Bsensor-r means that a sensor has a red band and Bsensor-n that it has a NIR band; Bsmm1-r means that a model requires a red band.
1) Definition 1: Sensor-bandtype
Purpose: according to definition 1, the band types of sensors can be screened out.
Input: sensor collection, e.g. Sensor = {Sensor1, Sensor2, ..., Sensori, ...}
Output: available band collection of each sensor, e.g. Bsensori = {Bsensor-r, Bsensor-n}
2) Definition 2: Soil Moisture Models-bandtype
Purpose: according to definition 2, the band types of inversion models can be screened out. SMM denotes the collection of models for monitoring soil moisture.
Input: model collection for monitoring soil moisture, e.g. SMM = {Smm1, Smm2, ..., Smmi, ...}
Output: available band collection of each model, e.g. Bsmm1 = {Bsmm1-r, Bsmm1-n, ...}, Bsmm2 = {Bsmm2-r, Bsmm2-n, ...}, ..., Bsmmi = {Bsmmi-r, Bsmmi-n, ...}
B. Rules level 2 - BandComparison
There is one definition at level 2, as shown in Fig. 3.
Definition 3: Sensor & Models-BandComparison
Purpose: according to definition 3, each sensor obtains the collection of models it can use to monitor soil moisture.
Input: the band collection of each sensor and of each model; in other words, the input of level 2 is the output of level 1.
Output: a determination of whether a sensor can use a model to achieve soil moisture inversion; e.g., if Bsensor matches Bsmm, i.e. the sensor provides the bands the model requires, the sensor can use the model.
In particular, the rule terminates once every sensor has been matched against all models, so the final result is that every sensor has a corresponding model collection, e.g., Sensor1 has Sensor1-M = {SMM1, SMM2, SMM3, ...}.
Fig. 3. The flow of rule levels 1 and 2
C. Rules level 3 - Sensor ranking according to observation capability comparison
The input of this level is the output of level 2: the model collection of each sensor (Sensor1-M, Sensor2-M, Sensor3-M, ...). The procedure of rule level 3 is shown in Fig. 4.
1) Determine the vegetation cover of the area of interest according to the calculated NDVI, as shown in steps 1 and 2 of Fig. 4.
2) According to the NDVI, the reasoning engine chooses the merit ranking of models for high or low vegetation cover, as shown in step 3 of Fig. 4.
3) According to the merits of the models, determine the ranking of sensors. The following situations arise, as shown in steps 4-6 of Fig. 4.
a) Case 1: when two sensors have the same set of models, the spatial resolution determines which one is better.
b) Case 2: when Sensor1-M strictly contains the model set of the other sensor, sensor 1 has the better observation capability.
c) Case 3: when two sensors have the same number of models but of different types, they are sorted by the merits of the models.
d) Case 4: when two sensors have different numbers and types of models, they are likewise sorted by the merits of the models.
The output of rule level 3 is the ranking of sensors according to observation capability, e.g. OCSensor1-M > OCSensor3-M > OCSensor2-M > ..., where OCSensor1-M denotes the observation capability of sensor 1 as determined by its inversion models.
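The level-3 procedure for a pair of sensors can be sketched as below. The NDVI threshold, the merit order of the models and the resolution values are hypothetical placeholders, not values taken from the paper.

```python
# A sketch of the level-3 comparison for two sensors, following the four
# cases above. The NDVI threshold, the merit order and the resolutions
# are hypothetical placeholders.

def vegetation_class(mean_ndvi, threshold=0.5):
    """Steps 1-2: classify the area of interest (threshold is hypothetical)."""
    return "high" if mean_ndvi >= threshold else "low"

MERIT = ["VSWI", "EVI", "NDVI"]  # hypothetical high-vegetation merit order, best first

def best_merit(models):
    """Rank of the best model a sensor offers (lower is better)."""
    ranks = [MERIT.index(m) for m in models if m in MERIT]
    return min(ranks) if ranks else len(MERIT)

def better_sensor(s1, models1, res1, s2, models2, res2):
    """Steps 4-6: return the sensor with the better observation capability."""
    if models1 == models2:            # case 1: same models -> finer resolution wins
        return s1 if res1 <= res2 else s2
    if models2 < models1:             # case 2: one set strictly contains the other
        return s1
    if models1 < models2:
        return s2
    return s1 if best_merit(models1) <= best_merit(models2) else s2  # cases 3-4

print(better_sensor("Sensor1", {"NDVI"}, 30, "Sensor2", {"NDVI", "VSWI"}, 250))
# Sensor2 (its model set strictly contains Sensor1's)
```

Sorting all sensors by repeated pairwise comparison of this kind yields the capability ranking that level 3 outputs.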
Fig. 4. The flow of rule level 3
IV. EXPERIMENTS
We used the Protégé-OWL API to build and manipulate the sensor ontology based on SSN for inference. The reasoning was implemented in the Semantic Web Rule Language (SWRL) and executed with the Jena rule engine, and was applied to several sensors and inversion models to reason about their observation capabilities.
A. Data
In this subsection we list the data used in the experiments: optical sensors, in-situ sensors and several soil moisture models. The optical sensors and soil moisture models are used to run the reasoning rules, and the data from the in-situ sensors to verify the results.
1) Optical sensors
The list of main optical sensors is shown in TABLE I.

TABLE I. THE LIST OF OPTICAL SENSORS

Number  Sensor     Platform
1       MODIS      EOS-Terra
2       ETM+       Landsat-7
3       HRG        SPOT-5
4       HSI        HJ-1A
5       WVC        HJ-1B
6       OLI        Landsat-8
7       BGIS-2000  QuickBird
8       WFV        GF-1
9       E2V        Pleiades
2) In-situ sensors
The results require measured data for validation, so we deployed sensors in the experimental field (Baoxie, Wuhan, China). The map of the soil moisture Sensor Web in Baoxie is shown in Fig. 5; 27 sensors monitor the soil moisture there.
Fig. 5. The Map of Soil moisture Sensor Web in Baoxie
3) Soil moisture models The list of models is shown in TABLE II.
TABLE II. THE LIST OF SOIL MOISTURE MODELS

Type of model                     Model   Required bands                Vegetation cover
Based on vegetation index         SVI     NIR, RED                      High
                                  RVI     NIR, RED                      High
                                  NDVI    NIR, RED                      High
                                  VCI     NIR, RED                      High
                                  XTNDVI  NIR, RED                      High
                                  EVI     NIR, RED, BLUE                High
                                  NDWI    NIR, near-infrared shortwave  High
Based on surface temperature      P       full-wave band (albedo)       Low
                                  ATI     full-wave band, thermal       Low
                                  TCI     thermal                       Low
                                  NDTI    thermal                       Low
Based on surface temperature and  VTCI    NIR, RED, thermal             High
vegetation index                  TVDI    NIR, RED, thermal             High, Low
                                  VITT    NIR, RED, thermal             High
                                  VSWI    NIR, RED, thermal             High
B. Results
As a result, the sensors were sorted in accordance with the assessment of their observation capabilities, and the ranking matches the results of actual observation through soil moisture inversion. We apply this method so that applications can make a better selection among available sensors, returning a more accurate matchmaking between sensors and soil moisture inversion models to users. Unlike existing approaches, which use rule-based reasoning to infer knowledge from sensor observations, our approach allows inferences about the sensors themselves and their related applications. Other organizations and researchers may likewise re-use our inference rules to compare sensors according to their observation capabilities.
C. Discussion
In this subsection we compare the query with the rules (Query-R) to the one without the rules (Query-NR). The two queries use the same data sources.
1) Time required
Fig. 6 shows that more CPU time is required to execute retrieval in Query-R than in Query-NR, a difference due mainly to the time required to process the rules. With 50 sensors, the total mean response time is about 2.890 seconds for Query-R and about 1.362 seconds for Query-NR.
Fig. 6. The comparison of response time between Query-R and Query-NR
2) Precision
Precision is employed to assess the matching quality of information retrieval: it is the fraction of retrieved instances that are relevant. Fig. 7 shows that Query-R yields good precision when compared against the set of sensors that can actually monitor soil moisture.
Fig. 7. The comparison of precision between Query-R and Query-NR
3) Ranking
In addition, we compare the ranking produced by Query-R with that of a statistical analysis. As shown in Fig. 8, the result is consistent with the actual situation; due to operational problems in the rules there are two deviations in the result, which future work will repair.
Fig. 8. The Results of Comparison
V. CONCLUSION
In this paper, the ontological representation of sensors together with inference allows the better observers of soil moisture to be identified from a set of sensors. Future work should add more conditions to the reasoning and experiment with SAR satellite sensors.
ACKNOWLEDGMENT Supported by the National Basic Research Program of
China (973 Program) under Grant 2011CB707101, by the National High Technology Research and Development Program of China (863 Program) under Grant 2013AA01A608, by the National Natural Science Foundation of China program under Grant 41171315, by the Program for New Century Excellent Talents in University under Grant NCET-11-0394, and by the Fundamental Research Funds for the Central Universities under Grant 2012619020203.
REFERENCES
[1] M. Calder, R. A. Morris, and F. Peri, "Machine reasoning about anomalous sensor data," Ecological Informatics, vol. 5, pp. 9-18, 2010.
[2] S. Esswein et al., "Towards ontology-based data quality inference in large-scale sensor networks," in Proceedings of the 12th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, Washington, DC, USA, 2012, pp. 898-903.
[3] S. Fernandez, I. Marsa-Maestre, J. R. Velasco, and B. Alarcos, "Ontology alignment architecture for semantic sensor web integration," Sensors, vol. 13, pp. 12581-12604, 2013.
[4] M. Koubarakis and K. Kyzirakos, "Modeling and querying metadata in the Semantic Sensor Web: the model stRDF and the query language stSPARQL," in L. Aroyo et al. (Eds.), ESWC 2010, Part I, LNCS 6088, 2010, pp. 425-439.
[5] K. Thirunarayan, C. A. Henson, and A. P. Sheth, "Situation awareness via abductive reasoning from semantic sensor data: a preliminary report," in Proceedings of the 2009 International Symposium on Collaborative Technologies and Systems (CTS 2009), May 18-22, 2009, pp. 111-118.
[6] G. De Mel et al., "Service-oriented reasoning architecture for resource-task assignment in sensor networks," ACITA 2011, September 27-28, 2011.
[7] S. Esswein et al., "Towards ontology-based data quality inference in large-scale sensor networks," in 2012 12th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, 2012, pp. 898-903.
[8] P. Anantharam, A. Smith, J. Pschorr, K. Thirunarayan, and A. Sheth, "Demonstration: dynamic sensor registration and semantic processing for ad-hoc MOBile environments (SemMOB)," in Proceedings of the 5th International Workshop on Semantic Sensor Networks (SSN12), Boston, Massachusetts, USA, November 2012, pp. 127-130.
[9] A. Ahmed, Y. Zhang, and S. Nichols, "Review and evaluation of remote sensing methods for soil-moisture estimation," SPIE Reviews, vol. 2, no. 1, 2011, 18 pages.
[10] S. Natali et al., "Estimating soil moisture using optical and radar satellite remote sensing data," in A. Marini and M. Talbi (Eds.), Desertification and Risk Analysis Using High and Medium Resolution Satellite Data, 2009, pp. 105-116.
[11] J. Qin et al., "Spatial upscaling of in-situ soil moisture measurements based on MODIS-derived apparent thermal inertia," Remote Sensing of Environment, vol. 138, pp. 1-9, 2013.
[12] J. Wang et al., "Soil moisture and vegetation water content estimation using two drought monitoring indexes," in Remote Sensing, Environment and Transportation Engineering (RSETE), 2011 International Conference, June 24-26, 2011, pp. 4411-4414.
[13] N. Chen and C. Hu, "A sharable and interoperable model for atmospheric satellite sensors and observations," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 5, no. 5, pp. 1519-1530, 2012.
[14] M. Compton et al., "The SSN ontology of the W3C semantic sensor network incubator group," Web Semantics: Science, Services and Agents on the World Wide Web, vol. 17, pp. 25-32, 2012.
[15] World Meteorological Organization (WMO), "List of all instruments," http://www.wmo-sat.info/oscar/instruments, accessed 2 July 2014.